JUnit Tests for RabbitMQ - java

I am building an application using RabbitMQ with Spring: so far so good.
To define Unit Tests I am using JUnit targeting an external server.
What I wanted to find out is whether there is a way to mock the RabbitMQ server to perform tests, and if there is more than one way, which is the best one.
I found some posts around, but they were written in 2012 or even before... maybe there's something newer, easier and more effective!
Thanks in advance

I wouldn't try to mock the RabbitMQ server itself; instead, mock your publication methods, channel factories, and so on in order to emulate error conditions (and the happy path, of course). What happens when your FoozleEvent.publish method throws an IOError, for example?
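For instance, here is a minimal Mockito/JUnit 4 sketch of that idea, assuming your publishing code wraps a Spring AMQP RabbitTemplate; the EventPublisher class below is a hypothetical stand-in for something like the FoozleEvent.publish method mentioned above:
import static org.junit.Assert.assertFalse;
import static org.mockito.Mockito.*;

import org.junit.Test;
import org.springframework.amqp.AmqpException;
import org.springframework.amqp.rabbit.core.RabbitTemplate;

public class PublishFailureTest {

    // Hypothetical publisher that wraps RabbitTemplate and reports broker errors to the caller.
    static class EventPublisher {
        private final RabbitTemplate template;

        EventPublisher(RabbitTemplate template) {
            this.template = template;
        }

        boolean tryPublish(String payload) {
            try {
                template.convertAndSend("events-exchange", "events.key", payload);
                return true;
            } catch (AmqpException e) {
                return false; // in real code: log, retry, dead-letter, etc.
            }
        }
    }

    @Test
    public void publishReportsFailureWhenBrokerIsUnreachable() {
        // Stub the template so any send behaves as if the broker were down.
        RabbitTemplate template = mock(RabbitTemplate.class);
        doThrow(new AmqpException("connection refused"))
                .when(template).convertAndSend(anyString(), anyString(), any(Object.class));

        assertFalse(new EventPublisher(template).tryPublish("some-event"));
    }
}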

We use mocking extensively for tests in the framework itself; explore the tests for ideas. It's not too bad on the RabbitTemplate side, but mocking for the listener container is more involved.
In some cases, though, a real integration test is needed, and in that case we use a JUnit @Rule to ignore the tests if there's no local RabbitMQ broker.
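For reference, a minimal sketch of that rule-based approach, assuming the spring-rabbit JUnit support is on the test classpath (BrokerRunning lives in org.springframework.amqp.rabbit.junit in recent versions; the queue name is made up):
import static org.junit.Assert.assertEquals;

import org.junit.ClassRule;
import org.junit.Test;
import org.springframework.amqp.core.Queue;
import org.springframework.amqp.rabbit.connection.CachingConnectionFactory;
import org.springframework.amqp.rabbit.core.RabbitAdmin;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.amqp.rabbit.junit.BrokerRunning;

public class RealBrokerIT {

    // Skips (rather than fails) every test in this class when no broker
    // is listening on localhost:5672.
    @ClassRule
    public static BrokerRunning brokerRunning = BrokerRunning.isRunning();

    @Test
    public void roundTripAgainstLocalBroker() {
        CachingConnectionFactory cf = new CachingConnectionFactory("localhost");
        try {
            new RabbitAdmin(cf).declareQueue(new Queue("it.sample.queue"));
            RabbitTemplate template = new RabbitTemplate(cf);
            template.convertAndSend("it.sample.queue", "hello");
            assertEquals("hello", template.receiveAndConvert("it.sample.queue", 5000));
        } finally {
            cf.destroy();
        }
    }
}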

To mock RabbitMQ in the Java world, there is a library that I am building: rabbitmq-mock.
Its purpose is exactly the use case you describe. You can simply replace the amqp-client ConnectionFactory and you will have most of RabbitMQ's features out of the box, without using IO (no port binding is needed) and without startup time.
Simply add the dependency in your pom.xml:
<dependency>
    <groupId>com.github.fridujo</groupId>
    <artifactId>rabbitmq-mock</artifactId>
    <version>1.0.14</version>
    <scope>test</scope>
</dependency>
Then you can use it by replacing the ConnectionFactory you provided through Spring configuration, or the one Spring Boot has provided for you:
@Configuration
@Import(AppConfiguration.class)
class TestConfiguration {

    @Bean
    ConnectionFactory connectionFactory() {
        return new CachingConnectionFactory(new MockConnectionFactory());
    }
}
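With that bean in place, anything built from the Spring-managed ConnectionFactory talks to the in-memory mock. A minimal smoke-test sketch, assuming JUnit 4 and spring-test on the classpath (the queue name and test class are illustrative):
import static org.junit.Assert.assertEquals;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.amqp.core.Queue;
import org.springframework.amqp.rabbit.connection.ConnectionFactory;
import org.springframework.amqp.rabbit.core.RabbitAdmin;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@ContextConfiguration(classes = TestConfiguration.class)
public class MockedBrokerSmokeTest {

    @Autowired
    private ConnectionFactory connectionFactory; // the CachingConnectionFactory wrapping the mock

    @Test
    public void roundTripThroughMockedBroker() {
        // "my-test-queue" is just an illustrative name
        new RabbitAdmin(connectionFactory).declareQueue(new Queue("my-test-queue"));

        RabbitTemplate template = new RabbitTemplate(connectionFactory);
        template.convertAndSend("my-test-queue", "ping");
        assertEquals("ping", template.receiveAndConvert("my-test-queue"));
    }
}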
Hope this will help!

Another approach, instead of mocking the RabbitMQ server itself, is to mock the dependent service on the other side of the external RabbitMQ server. You can do this using an async API simulation/mocking tool.
For example you can use Traffic Parrot which can be run in a Docker container as part of your CI/CD pipeline.
Here is a video demo of how you can use the tool to send mock response messages to a RabbitMQ queue in an async request/response pattern. There is also a corresponding tutorial available to follow.

Is there a way to use JMS in a JUnit test?

I want to run a JUnit test that uses JMS. Is it possible to have a JUnit test use JMS outside of an application server like JBoss or a CDI container?
Provided that sending and consuming the message is completely decoupled from JMS, you could mock it.
For example: you can have a class that implements an interface like "IMyClassSender". In real (non-JUnit) code, all this class does is submit the message to JMS. In the JUnit test, implement IMyClassSender with a class that takes the input and, instead of submitting it to JMS, passes it to your consumer class.
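A rough sketch of that idea; all of the interface and class names below are made up for illustration:
import javax.jms.JMSException;
import javax.jms.Queue;
import javax.jms.Session;

// Production code depends only on this abstraction, not on JMS directly.
interface IMyClassSender {
    void send(String message);
}

// Real implementation (non-JUnit code): hands the message to JMS.
class JmsMyClassSender implements IMyClassSender {
    private final Session session;
    private final Queue queue;

    JmsMyClassSender(Session session, Queue queue) {
        this.session = session;
        this.queue = queue;
    }

    @Override
    public void send(String message) {
        try {
            session.createProducer(queue).send(session.createTextMessage(message));
        } catch (JMSException e) {
            throw new RuntimeException("failed to publish to JMS", e);
        }
    }
}

// Test double for JUnit: skips JMS and hands the message straight to the consumer.
class DirectMyClassSender implements IMyClassSender {
    // Stand-in for your consumer class; adapt to whatever your real consumer exposes.
    interface Consumer {
        void handle(String message);
    }

    private final Consumer consumer;

    DirectMyClassSender(Consumer consumer) {
        this.consumer = consumer;
    }

    @Override
    public void send(String message) {
        consumer.handle(message);
    }
}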
Alternatively, if you are using ActiveMQ: http://activemq.apache.org/how-to-unit-test-jms-code.html
You may also reconsider using an application server for this - Arquillian (http://arquillian.org) allows for executing unit and integration tests within a Java EE environment of your choice, and manages the app server lifecycle on its own.

Amazon Kinesis + Integration Tests

I'm currently working on a series of web services which we need to integrate with Kinesis. The implementation has been done; however, we have a series of integration tests (our web services all use Spring Boot, so we use the @WebIntegrationTest annotation on our test classes to start up a local instance of the server and then call our resources with a TestRestTemplate) which are currently trying, and failing, to connect to the real Kinesis.
Although in ordinary unit tests it's not a problem to mock out calls to the methods within the Kinesis library, we can't really do this in the integration tests as the whole application stack is wired up with Spring. For a few other things (such as OAuth2 and calls to our other web-services) we've been able to use WireMock to mock out the actual endpoints - what I'd really like to do is use WireMock in this fashion to mock out the call to the AmazonKinesisClient but I can't find any advice on how to do this.
Alternatively I have seen that some AWS components have test libraries written by third parties which allow you to run a local version of it (e.g.: DynamoDbLocal) but can't find such a solution for Kinesis.
Is anyone able to give me some advice on how to run integration tests with Kinesis?
It might already be too late to give a solution, but I will add what my team has done to replicate AWS resources locally, as we use a lot of Kinesis, DynamoDB, S3 and CloudWatch.
We have created wrappers around Localstack (https://github.com/localstack/localstack) that allow us to spin up local instances of the necessary services as Docker containers using docker-compose.
A typical docker-compose.yml file for us looks like:
version: '2'
services:
  localstack:
    image: "localstack/localstack"
    environment:
      - SERVICES=kinesis,dynamodb,cloudwatch
    ports:
      - "4568"
      - "4569"
      - "4582"
Then, during the setup phase of the integration tests, our wrapper runs docker-compose up and the tests execute against the local infrastructure.
Later, during tear-down, the wrapper kills the containers.
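Inside the tests, the AWS clients are then pointed at the locally exposed ports instead of the real AWS endpoints. A rough sketch with the AWS SDK v1 (the endpoint URL and dummy credentials are assumptions; with the compose file above, Docker maps the container ports to random host ports, so in practice our wrapper resolves the mapped port first):
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.client.builder.AwsClientBuilder.EndpointConfiguration;
import com.amazonaws.services.kinesis.AmazonKinesis;
import com.amazonaws.services.kinesis.AmazonKinesisClientBuilder;

public class LocalKinesisClientFactory {

    // kinesisEndpoint would be e.g. "http://localhost:4568", or whatever host port
    // docker-compose mapped the container's 4568 to.
    public static AmazonKinesis create(String kinesisEndpoint) {
        return AmazonKinesisClientBuilder.standard()
                // LocalStack accepts any credentials
                .withCredentials(new AWSStaticCredentialsProvider(
                        new BasicAWSCredentials("test-access-key", "test-secret-key")))
                .withEndpointConfiguration(new EndpointConfiguration(kinesisEndpoint, "us-east-1"))
                .build();
    }
}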
I ran into the same issue and the only mock implementation I found so far was a nodejs one: https://github.com/mhart/kinesalite
It does the trick: I managed to run my Java Kinesis client against it; I just had to set the endpoint in kinesis.properties:
kinesisEndpoint=http://localhost:4567
The downside is that it is not trivial to use during build-time tests: you need to figure out a way to start the mock Kinesis before the tests run (using a Maven plugin or something). I haven't gotten to that yet.
Just a small addition to the existing answers. By the way, they are great; you should really use tools like localstack to start fake AWS services during the test phase.
If you're using JUnit 5 in your tests, your life could be even simpler with JUnit 5 extensions for AWS, a set of extensions that can be useful for testing AWS-related code. These extensions can be used to inject clients for AWS service mocks provided by tools like localstack. Both AWS Java SDK 2.x and 1.x are supported:
@ExtendWith(DynamoDB.class)
class AmazonDynamoDBInjectionTest {

    @AWSClient(endpoint = Endpoint.class)
    private AmazonDynamoDB client;

    @Test
    void test() throws Exception {
        Assertions.assertNotNull(client);
        Assertions.assertEquals(
                Collections.singletonList("table"),
                client.listTables().getTableNames().stream().sorted().collect(Collectors.toList())
        );
    }
}
Here, the client will simply be injected into your test class and configured according to the Endpoint configuration class.

How to automate Kafka Testing

We have developed a system that uses Kafka to queue data and later consume that data to place orders for users.
We have tested certain things manually, but now our aim is to automate the process.
Is there any client available to test it? I found ways to unit test it using the Kafka client itself, but my aim is to test the system as a whole.
EDIT: our purpose is just API testing, i.e., just the back-end, not the UI.
You can start Kafka programmatically in your integration test. Kafka uses ZooKeeper, so first look at the ZooKeeper TestingServer: an instance of this class creates and starts a ZooKeeper server on the given port.
Next, look at KafkaServerStartable.scala. You have to provide a configuration that points to your in-memory ZooKeeper server and invoke the startup() method. Here is some code:
import kafka.server.KafkaConfig;
import kafka.server.KafkaServerStartable;

import java.util.Properties;

public class KafkaTest {

    public KafkaTest() {
        // createProperties() must at least point zookeeper.connect at the
        // in-memory TestingServer and set a free port/log dir for the broker.
        Properties properties = createProperties();
        KafkaConfig kafkaConfig = new KafkaConfig(properties);
        KafkaServerStartable kafka = new KafkaServerStartable(kafkaConfig);
        kafka.startup();
    }
}
Hope these help:)
You can go for integration-testing or end-to-end testing by bringing up Kafka in a docker container. If you use Apache kafka-clients:2.1.0, then you don't need to deal with ZooKeeper at the API level while producing or consuming the records.
Dockerizing Kafka and testing against it helps to cover the scenarios on a single-node as well as a multi-node Kafka cluster. This way you don't have to test against a mock/in-memory Kafka first and the real Kafka later. This can be done using Testcontainers.
If you have too many test scenarios to cover, you can go for declarative Kafka testing, docker-compose style, with which you can eliminate the Kafka client API coding.
Check out some handy examples here for validating produce and consume.
The Testcontainers project also supports docker-compose.
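For reference, a minimal Testcontainers sketch in JUnit 4 rule style (the Confluent image tag and topic name are just examples; pick an image that matches your client version):
import org.junit.ClassRule;
import org.junit.Test;
import org.testcontainers.containers.KafkaContainer;
import org.testcontainers.utility.DockerImageName;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class KafkaContainerIT {

    // Starts a single-node Kafka broker in Docker before the tests and stops it afterwards.
    @ClassRule
    public static KafkaContainer kafka =
            new KafkaContainer(DockerImageName.parse("confluentinc/cp-kafka:7.4.0"));

    @Test
    public void canProduceToDockerizedKafka() throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", kafka.getBootstrapServers());
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // get() forces the send to complete so a broker problem fails the test
            producer.send(new ProducerRecord<>("orders", "order-1", "{\"qty\":1}")).get();
        }
    }
}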
As I understood, you want to implement end-to-end tests starting from messages. Some colleagues and I recently researched libraries, tools and frameworks for testing event-driven systems that use Kafka.
We found Zerocode, an automated API testing framework that uses a declarative language such as JSON or YAML. It supports REST, SOAP and, what we were interested in, messaging. It sends messages to and consumes messages from topics and makes assertions at the end; it is easy to learn and use. Here is the link for more details: Zerocode. It seems like a good option, although we are only starting to use it.
You will need Kafka brokers and their dependencies running to make this solution work, but nothing that a docker-compose file and/or some scripts can't handle to bring up an environment for tests.
Another way is to implement your own project with Kafka libraries and use the libraries to send and receive messages in the tests.
Unfortunately, we couldn't find many more options out there. Kafka has a proposal to create a test kit, but it is not in progress yet.
Unfortunately, the approach described by Pavel does not work for Kafka 2.8+ anymore. However, I could make our end-to-end tests with Kafka 3.2 work using the approach taken by KarelDB:
Properties props = TestUtils.createBrokerConfig(
        brokerId,
        zkConnect,
        false,
        false,
        TestUtils.RandomPort(),
        noInterBrokerSecurityProtocol,
        noFile,
        EMPTY_SASL_PROPERTIES,
        true,
        false,
        TestUtils.RandomPort(),
        false,
        TestUtils.RandomPort(),
        false,
        TestUtils.RandomPort(),
        Option.<String>empty(),
        1,
        false,
        1,
        (short) 1
);

KafkaConfig config = KafkaConfig.fromProps(props);
KafkaServer server = TestUtils.createServer(config, Time.SYSTEM);

// `createServer` will also start your Kafka server.
// To shut it down:
server.shutdown();

Manually setting up Spring WebSocket STOMP support

I'm trying to set up a STOMP WS endpoint using spring-websocket and spring-messaging. I am trying to do this manually: no application context is involved at all, and certainly no dispatcher. My goal is to wire up the appropriate Spring components in code inside a ServletContextListener, then register the wired up components directly with the javax.websocket.server.ServerContainer in my JSR 356 compatible container (Tomcat 7). At first, I would like to get this working with the "simple" broker built into spring-messaging; secondly, I would like to implement my own "broker" to directly integrate with an in-process ActiveMQ using the VM transport. This would be in contrast to the STOMP relay which spring-messaging also provides.
The Spring documentation states (http://docs.spring.io/spring/docs/current/spring-framework-reference/html/websocket.html):
"...Spring’s WebSocket support does not depend on Spring MVC. It is relatively simple to integrate a WebSocketHandler into other HTTP serving environments with the help of WebSocketHttpRequestHandler."
However, I am not finding it to be simple. Essentially, I started with:
public void contextInitialized(ServletContextEvent sce) {
    ServerContainer websocketContainer = (ServerContainer) sce.getServletContext()
            .getAttribute("javax.websocket.server.ServerContainer");
    ???
    websocketContainer.addEndpoint(???);
}
And ended up with an incoherent mess of assorted spring-websocket and spring-messaging constructor invocations which do not compile and are certainly not worth reproducing here.
I realize this is a bit vague, this is because I'm a bit lost! Has anyone done something like this, or has some general guidance to contribute?
Did you try the sample application - tests for the stock portfolio? (The link to it is at the very end of the Spring documentation page linked above.)
It says
Demonstrates 3 approaches to testing a Spring STOMP over WebSocket application:
Server-side controller tests that load the actual Spring configuration (context sub-package)
Server-side controller tests that test one controller at a time without loading any Spring configuration (standalone sub-package)
End-to-end, full integration tests using an embedded Tomcat and a simple STOMP Java client (tomcat sub-package)
See the Javadoc of the respective tests for more details.
The second option is probably what you need.

What is the best way to write a test case for JERSEY web services?

I have a JAX-RS web service implemented with the Jersey library, and now I want to test it. In order to do that, I'd like to host this service in my test, pre-initializing it with mocked services.
What is the best way to host such a service and execute the test calls?
#Path("/srv")
public class MyService
{
#GET
public void action(#Context UriInfo uri)
{ ... }
}
#Test
public void myTest()
{
MyService service = new MyService();
service.setSomething(...);
// How do I host it?
// How do I call it?
}
The new (revised) Jersey Test Framework, which is part of the Jersey 1.1.2-ea release, now supports in-process or in-memory testing. In order to run your tests in memory, all you have to do is set the property test.containerFactory to com.sun.jersey.test.framework.spi.container.inmemory.InMemoryTestContainerFactory, i.e., run your tests as follows:
mvn clean test -Dtest.containerFactory=com.sun.jersey.test.framework.spi.container.inmemory.InMemoryTestContainerFactory -DenableLogging
For more details please go through the blog entry titled Jersey Test Framework re-visited! at http://blogs.oracle.com/naresh.
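With that framework, a test typically extends JerseyTest and tells it which resource packages to deploy. A rough sketch against the MyService resource from the question, assuming the Jersey 1.x test framework API and a made-up package name (depending on the container factory you may need a LowLevelAppDescriptor instead of a WebAppDescriptor):
import static org.junit.Assert.assertEquals;

import com.sun.jersey.api.client.ClientResponse;
import com.sun.jersey.test.framework.JerseyTest;
import com.sun.jersey.test.framework.WebAppDescriptor;
import org.junit.Test;

public class MyServiceTest extends JerseyTest {

    public MyServiceTest() {
        // Deploy every resource found in this (hypothetical) package.
        super(new WebAppDescriptor.Builder("com.example.myservice").build());
    }

    @Test
    public void srvRespondsNoContent() {
        // resource() returns a WebResource rooted at the deployed application.
        ClientResponse response = resource().path("srv").get(ClientResponse.class);
        // action() returns void, so a plain GET should yield 204 No Content.
        assertEquals(204, response.getStatus());
    }
}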
I believe the Jersey Test Framework provides a solution for your requirement. It allows you to deploy a single service and run all its tests. You can use the framework to run your tests against the Grizzly web container, embedded GlassFish and/or HTTPServer.
Please note that you could use the framework to run your tests against regular web containers like GlassFish and Tomcat too. In case you have any more queries, please feel free to send an e-mail to me or the Jersey users mailing list (users@jersey.dev.java.net).
I haven't tried it, but a JUnit extension like HtmlUnit or HttpUnit may be a good way to test a JAX-RS/Jersey service. The test case can use XPath expressions to find expected return values and validate the returned value against them. See http://htmlunit.sourceforge.net/gettingStarted.html for more info.
You can use Grizzly to host the services and then use the Jersey Client to access them. Take a look at the sample applications. For example, in the Bookstore sample you may find the TestSupport class and JerseyTest class (found in the jersey-test-framework) of particular interest.
I hope this helps.
(Unfortunately Stack Overflow wouldn't let me post until I removed all the hyperlinks so happy Googling!).
Okay, I get it now. Right now the framework doesn't support in-process testing, but we are working on it.
We will see to it that this support is added in a coming version of the Jersey Test Framework.
Have you looked into using the Jersey Test Framework? Unfortunately it's still more of an integration test than a unit test, but it might get you on your way.
