In my Spring-based web application I have a simple layered architecture: Service -> Manager -> DAO -> database. For logging purposes I want to capture a request entering the Service and its exit from the Service in one go, so that it is easy to debug issues. Otherwise the logs contain output from different threads intermingled with each other, which is hard to read. Is this possible with an existing logging framework like Log4j?
It is possible with any logging framework. You can use Spring AOP to create a "logging" aspect around your service methods. Here's an example:
@Aspect
public class LoggingAspect {

    private static final Logger LOG = LoggerFactory.getLogger(LoggingAspect.class);

    // Spring AOP supports only the execution() pointcut designator, not call()
    @Pointcut("execution(* com.yourcompany..*.*(..))")
    public void serviceMethod() {
    }

    @Before("serviceMethod()")
    public void logMethodCalls(final JoinPoint joinPoint) {
        if (LOG.isDebugEnabled()) {
            LOG.debug("Calling method {} with args {}",
                    joinPoint.getSignature(), joinPoint.getArgs());
        }
    }
}
Just wire it as a Spring Bean:
<bean class="com.somepackage.LoggingAspect" />
<aop:aspectj-autoproxy/>
and calls to public methods of Spring beans in the matched packages will be logged.
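Since the goal is to see entry and exit together, an @Around advice can log both sides in one place. A minimal sketch, assuming the same package pointcut as above and SLF4J for logging:

```java
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

@Aspect
public class EntryExitLoggingAspect {

    private static final Logger LOG = LoggerFactory.getLogger(EntryExitLoggingAspect.class);

    // One advice sees both the call and its outcome (result or exception)
    @Around("execution(* com.yourcompany..*.*(..))")
    public Object logAround(ProceedingJoinPoint pjp) throws Throwable {
        LOG.debug("Entering {} with args {}", pjp.getSignature(), pjp.getArgs());
        try {
            Object result = pjp.proceed();
            LOG.debug("Exiting {} with result {}", pjp.getSignature(), result);
            return result;
        } catch (Throwable t) {
            LOG.debug("Exiting {} with exception {}", pjp.getSignature(), t.toString());
            throw t;
        }
    }
}
```

To keep each thread's lines readable when they interleave, a common companion is to put a per-request correlation id into the SLF4J MDC (MDC.put("requestId", ...)) and add %X{requestId} to the layout pattern, so you can filter the log per request.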
I am using Spring Boot with Log4j2, and I'd like to trigger a custom method in a @Service class when logger.error(...) is called.
For example,
@Service
public class Foo {

    private final Logger logger = LogManager.getLogger(getClass());
    ...

    public void doSomething() {
        try {
            ...
        } catch (Exception e) {
            logger.error("Error!", e); // When `error` is triggered...
        }
    }
}

// Other class
@Service
public class Bar {

    @Autowired
    private NotificationService notificationService;

    public void triggeredOnError() { // I'd like to trigger this method
        this.notificationService.notifySomething();
    }
}
I'd like to know whether this is possible in Log4j2 with Spring Boot. The thing is, I just want to hook into the default logger.error(...) call, since I don't want to change the default behavior of Log4j2. I researched a bit, and a filter or adapter might be the solution here, but I am not really sure how to achieve this. Please help me out!
While an appender would work, as Mark suggests, I would implement a Filter. A Filter can be placed in four different locations in Log4j 2, and it has the option of forcing the log event to be logged, forcing it not to be logged, or just continuing with the normal evaluation of whether it should be logged. A filter can also be configured with onMatch=NEUTRAL and onMismatch=NEUTRAL so that it has no effect on whether the log event is processed, but still allows some other processing to take place. In addition, Filters are much easier to write than Appenders.
You can find a sample Filter at http://logging.apache.org/log4j/2.x/manual/extending.html#Filters
What you should not do in a Filter, though, is use it as a way to write the log event to some destination. That is exactly what Appenders are for.
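As a concrete illustration, a neutral filter that fires a callback on ERROR events might look like this sketch (the @Plugin/AbstractFilter API is from log4j-core; SpringContextHolder and the notify call are hypothetical placeholders for however you reach your Spring bean):

```java
import org.apache.logging.log4j.Level;
import org.apache.logging.log4j.core.Filter;
import org.apache.logging.log4j.core.LogEvent;
import org.apache.logging.log4j.core.config.plugins.Plugin;
import org.apache.logging.log4j.core.config.plugins.PluginFactory;
import org.apache.logging.log4j.core.filter.AbstractFilter;

// Registered via @Plugin; picked up when the configuration's "packages" attribute includes it
@Plugin(name = "ErrorNotifyFilter", category = "Core", elementType = Filter.ELEMENT_TYPE)
public class ErrorNotifyFilter extends AbstractFilter {

    private ErrorNotifyFilter() {
        // NEUTRAL/NEUTRAL: never changes whether the event is logged
        super(Result.NEUTRAL, Result.NEUTRAL);
    }

    @Override
    public Result filter(LogEvent event) {
        if (event.getLevel().isMoreSpecificThan(Level.ERROR)) {
            // Hypothetical hook: look up the Spring bean and notify it, e.g.
            // SpringContextHolder.getBean(Bar.class).triggeredOnError();
        }
        return Result.NEUTRAL;
    }

    @PluginFactory
    public static ErrorNotifyFilter createFilter() {
        return new ErrorNotifyFilter();
    }
}
```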
IMO the easiest way to achieve this is to create a special appender and, in the Log4j2 configuration, associate it with a logger of your choice (or with all loggers, if you want a "global" configuration).
Then you can attach a filter to the appender so that it is called only for error messages.
The only potential issue is reaching the Spring bean from a Log4j2 appender. Read this SO thread to understand how you can technically achieve that.
The benefit of this method is that you don't change the framework but instead leverage the configuration options it already provides.
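For comparison, a minimal custom appender along these lines might look like the following sketch (the plugin API is from log4j-core; the Spring lookup is again a hypothetical placeholder):

```java
import org.apache.logging.log4j.Level;
import org.apache.logging.log4j.core.Filter;
import org.apache.logging.log4j.core.LogEvent;
import org.apache.logging.log4j.core.appender.AbstractAppender;
import org.apache.logging.log4j.core.config.plugins.Plugin;
import org.apache.logging.log4j.core.config.plugins.PluginAttribute;
import org.apache.logging.log4j.core.config.plugins.PluginElement;
import org.apache.logging.log4j.core.config.plugins.PluginFactory;

@Plugin(name = "NotifyAppender", category = "Core", elementType = "appender", printObject = true)
public class NotifyAppender extends AbstractAppender {

    private NotifyAppender(String name, Filter filter) {
        super(name, filter, null);
    }

    @Override
    public void append(LogEvent event) {
        // Only reached for events routed to this appender (after its filter)
        if (event.getLevel().isMoreSpecificThan(Level.ERROR)) {
            // Hypothetical: delegate to a Spring bean, e.g. via a static context holder
            // SpringContextHolder.getBean(Bar.class).triggeredOnError();
        }
    }

    @PluginFactory
    public static NotifyAppender createAppender(@PluginAttribute("name") String name,
                                                @PluginElement("Filter") Filter filter) {
        return new NotifyAppender(name, filter);
    }
}
```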
I have a REST service developed using Spring Boot. I have imported a number of external libraries which contain some interceptors. Is there a way to print the list of interceptors along with the order in which they will be triggered?
You can inject all beans of a given type (in this case the org.springframework.web.servlet.HandlerInterceptor interface) into any of your components. So if you want to print (or do something else with) all interceptors, you can do something like this:
@Component
public class SomeBean {

    @Autowired
    private List<org.springframework.web.servlet.HandlerInterceptor> interceptors;

    @PostConstruct // not required, but lets you print at application startup
    public void printInterceptors() {
        this.interceptors.forEach(i -> System.out.println(i.getClass().getName()));
    }
}
Also, I believe Spring prints the registered interceptors at startup, probably in the debug log.
Say I have the following route:
from(rabbitMQUri)
    .to(myCustomProcessor)
    .choice()
        .when(shouldGotoA)
            .to(fizz)
        .when(shouldGotoB)
            .to(buzz)
        .otherwise()
            .to(foo);
Let's pretend that myCustomProcessor tunes shouldGotoA and shouldGotoB according to the message consumed from RabbitMQ.
I would like to unit test 3 scenarios:
A "fizz" message is consumed and shouldGotoA is set to true, which executes the first when(...).
A "buzz" message is consumed and shouldGotoB is set to true, which executes the second when(...).
A "foo" message is consumed and the otherwise() is executed.
My question is: how do I mock/stub the RabbitMQ endpoint so that the route executes as it normally would in production, but without the test actually connecting to a RabbitMQ server? I need some kind of "mock message" producer.
A code example or snippet would be extremely helpful and very much so appreciated!
This is one way of putting together a suitable test.
Firstly define an empty Camel Context with just a ProducerTemplate in it:
<camel:camelContext id="camelContext">
<camel:template id="producerTemplate" />
</camel:camelContext>
I do this so that when I execute the test, I can control which routes actually start as I don't want all my routes starting during a test.
Now in the test class itself, you'll need references to the producer template and Camel Context. In my case, I'm using Spring and I autowire them in:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = { "classpath:/spring/spring-test-camel.xml" })
public class MyTest {

    @Autowired
    private ProducerTemplate producerTemplate;

    @Autowired
    private CamelContext camelContext;
In the test itself, replace the RabbitMQ/ActiveMQ/JMS component in the context with the seda or direct component, e.g. replace all JMS calls with a seda queue:
camelContext.removeComponent("jms");
camelContext.addComponent("jms", this.camelContext.getComponent("seda"));
camelContext.addRoutes(this.documentBatchRouting);
Now whenever you read from or write to a JMS URI, it actually goes to a seda queue. This is similar to injecting a new URI into the component, but it takes less configuration and lets you add new endpoints to the route without having to remember to inject all the URIs.
Finally, in the test, use the producer template to send a test message:
producerTemplate.sendBody("jms:MyQueue", 2);
Your route should then execute, and you can test your expectations.
Two things to note are:
Your transaction boundaries may change, especially if you replace JMS queues with a direct component
If you are testing several routes, you'll have to be careful to remove the route from the Camel Context at the end of the tests for that route.
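For reference, here is a compact, self-contained variant of the same trick using CamelTestSupport (the route and endpoint names are made up for this sketch; the component swap happens before the routes start):

```java
import org.apache.camel.CamelContext;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.test.junit4.CamelTestSupport;
import org.junit.Test;

public class SedaSwapTest extends CamelTestSupport {

    @Override
    protected CamelContext createCamelContext() throws Exception {
        CamelContext ctx = super.createCamelContext();
        // Register the seda component under the "jms" name before any route starts
        ctx.addComponent("jms", ctx.getComponent("seda"));
        return ctx;
    }

    @Override
    protected RouteBuilder createRouteBuilder() {
        return new RouteBuilder() {
            @Override
            public void configure() {
                // Stand-in for the production route; "jms:in" now resolves to seda
                from("jms:in").to("mock:out");
            }
        };
    }

    @Test
    public void jmsUriIsServedBySeda() throws Exception {
        getMockEndpoint("mock:out").expectedBodiesReceived(2);
        template.sendBody("jms:in", 2);
        assertMockEndpointsSatisfied();
    }
}
```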
It may depend on which component you are using (AMQP or RabbitMQ) for the communication.
The single most important resource for sample code in Camel is the JUnit test cases in the source.
Two files that do similar things to what you need are the two below, but you may want to look around the test cases in general for inspiration:
AMQPRouteTest.java
RabbitMQConsumerIntTest.java
A more "basic" way to make routes testable is to make the "from" URI a parameter.
Let's say you make your RouteBuilder something like this:
private String fromURI = "amqp:/..";

public void setFromURI(String fromURI) {
    this.fromURI = fromURI;
}

public void configure() {
    from(fromURI).whatever();
}
Then you can inject a "seda:foobar" endpoint into fromURI before you start the unit test. The seda endpoint is trivial to test. This assumes you don't need to test AMQP/RabbitMQ-specific constructs, but simply want to receive the payload.
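Assuming the fragment above lives in a class called MyRouteBuilder (a name made up for this sketch), the test setup could look like:

```java
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.test.junit4.CamelTestSupport;
import org.junit.Test;

public class FromUriInjectionTest extends CamelTestSupport {

    @Override
    protected RouteBuilder createRouteBuilder() {
        // MyRouteBuilder is the (hypothetical) class holding the fragment above
        MyRouteBuilder builder = new MyRouteBuilder();
        builder.setFromURI("seda:foobar"); // swap AMQP for an in-memory endpoint
        return builder;
    }

    @Test
    public void deliversPayloadToTheRoute() throws Exception {
        template.sendBody("seda:foobar", "hello");
        // assert whatever the route is expected to do with the payload
    }
}
```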
A good way to make software more testable (especially software that communicates with external systems) is to use dependency injection. I love Guice, and it is directly supported by Camel.
(All this will burden you with learning about dependency injection, but it will pay off very soon, I can assure you.)
In this scenario you would just inject Endpoints. You pre-configure the endpoints like this (this code would be placed in a module):
@Provides
@Named("FileEndpoint")
private Endpoint fromFileEndpoint() {
    FileEndpoint fileEndpoint = getContext().getEndpoint("file:" + someFolder, FileEndpoint.class);
    fileEndpoint.setMove(".done");
    fileEndpoint.setRecursive(true);
    fileEndpoint.setDoneFileName(FtpRoutes.DONE_FILE_NAME);
    ...
    return fileEndpoint;
}
Your RouteBuilder just injects the endpoint:
@Inject
private MyRoutes(@Named("FileEndpoint") final Endpoint fileEndpoint) {
    this.fileEndpoint = fileEndpoint;
}

@Override
public void configure() throws Exception {
    from(fileEndpoint)....
}
To easily test such a route, you inject a different endpoint for the test: not the FileEndpoint but "direct:something". A very easy way to do this is Jukito, which combines Guice with Mockito. A test would look like this:
@RunWith(JukitoRunner.class)
public class OcsFtpTest extends CamelTestSupport {

    public static class TestModule extends JukitoModule {

        @Override
        protected void configureTest() {
            bind(CamelContext.class).to(DefaultCamelContext.class).in(TestSingleton.class);
        }

        @Provides
        @Named("FileEndpoint")
        private Endpoint testEndpoint() {
            DirectEndpoint fileEndpoint = getContext().getEndpoint("direct:a", DirectEndpoint.class);
            return fileEndpoint;
        }
    }

    @Inject
    private MyRoutes testObject;

    @Test
    ....
}
Now the testObject will get the direct endpoint instead of the file endpoint. This works with all kinds of Endpoints and, in general, with all interfaces/abstract classes and APIs that rely heavily on interfaces (Camel excels here!).
I have a Spring AOP aspect used for logging, where a method can be included for logging by adding an annotation to it, like this:
@AspectLogging("do something")
public void doSomething() {
    ...
}
I've been using this on Spring beans and it's been working just fine. Now, I wanted to use it on a REST-service, but I ran into some problems. So, I have:
@Path("/path")
@Service
public class MyRestService {

    @Inject
    private Something something;

    @GET
    @AspectLogging("get some stuff")
    public Response getSomeStuff() {
        ...
    }
}
and this setup works just fine. The REST service that I'm now trying to add the logging to has an interface, and somehow that messes things up. As soon as I add the @AspectLogging annotation to one of the methods, no dependencies are injected into the bean, and the aspect is never called!
I've also tried adding an interface to the REST service that was working, and it then fails with the same error.
How can having an interface lead to this type of problem? The aspect logger works on classes with interfaces elsewhere; it only seems to be a problem when it's a REST service.
Ref. the Spring documentation (para 2):
To enable AspectJ annotation support in the Spring IoC container, you
only have to define an empty XML element aop:aspectj-autoproxy in your
bean configuration file. Spring will then automatically create
proxies for any of your beans that are matched by your AspectJ
aspects.
For cases in which interfaces are not available or not used in an
application's design, it is possible to create proxies by relying on
CGLIB. To enable CGLIB, you need to set the attribute
proxy-target-class="true" on aop:aspectj-autoproxy.
If your class implements an interface, a JDK dynamic proxy will be used; if it does not implement any interface, a CGLIB proxy will be created. You can achieve this with @EnableAspectJAutoProxy. Here is a sample:
@Configuration
@EnableAspectJAutoProxy
public class AppConfig {

    @Bean
    public LoggingAspect loggingAspect() {
        return new LoggingAspect();
    }
}

@Component
@Aspect
public class LoggingAspect {
    ...
    ...
}
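If the JDK dynamic proxy (used whenever the bean implements an interface) is what hides the concrete class's JAX-RS annotations, forcing CGLIB proxies is worth trying; a minimal sketch:

```java
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.EnableAspectJAutoProxy;

@Configuration
// proxyTargetClass = true forces CGLIB subclass proxies even for beans that
// implement interfaces, so annotations on the concrete class stay visible
@EnableAspectJAutoProxy(proxyTargetClass = true)
public class AopConfig {
}
```

The XML equivalent is `<aop:aspectj-autoproxy proxy-target-class="true"/>`.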
In my opinion, what you are actually trying to do is add Spring annotations to a class maintained by Jersey. As a result you get a proxy of a proxy of a proxy of something. I don't think this is a good idea, or that it will work without problems. I had a similar issue when I tried to implement bean-based validation: for some reason, when @PathParam and @Valid annotations were used in the same place, the validation annotations were not visible. My advice is to move your logging to the @Service layer instead of the @Controller.
I have a requirement like below:
We have 20 message-driven beans in our application. Whenever a message arrives for a bean, I need to log information about the message to the database. I could add this to each bean, but then I'd have to change each and every class.
Is there a way to add some filter class that fires before an MDB's onMessage method executes, so that I can have one class for logging all the MDB messages?
In general you should use some AOP technique. Specifically, EJB provides an easy way to apply interceptors to MDBs:
public class LoggingInterceptor {
    ...
    @AroundInvoke
    protected Object myInterceptor(InvocationContext ctx) throws Exception {
        // do logging here...
        return ctx.proceed();
    }
}

@Interceptors(LoggingInterceptor.class)
public class SomeBean implements MessageListener {

    public void onMessage(Message message) {
        //....
    }
}
Example taken from Configuring an Interceptor Class for an EJB 3.0 MDB.
To address your question from the comment: inside an interceptor you have access to the InvocationContext, which exposes all the required attributes (note that `class` is a reserved word, so the variable needs another name):
String className = ctx.getMethod().getDeclaringClass().getName();
Message msg = (Message) ctx.getParameters()[0];
Note that you can even alter the parameters, or substitute different ones, in the interceptor.
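Since annotating all 20 beans is exactly what you want to avoid, a default interceptor binding in ejb-jar.xml applies the interceptor to every bean in the module without touching the classes. A sketch (the package name com.example is an assumption):

```xml
<ejb-jar xmlns="http://java.sun.com/xml/ns/javaee" version="3.0">
  <assembly-descriptor>
    <!-- Bind LoggingInterceptor to every EJB in this module -->
    <interceptor-binding>
      <ejb-name>*</ejb-name>
      <interceptor-class>com.example.LoggingInterceptor</interceptor-class>
    </interceptor-binding>
  </assembly-descriptor>
</ejb-jar>
```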
You tagged your question with both ejb-3.0 and spring. With Spring AOP the options are much more flexible, but the general idea still applies.