Should integration testing of DAOs be done in an application server? - java

I have a three tier application under development and am creating integration tests for DAOs in the persistence layer. When the application runs in Websphere or JBoss I expect to use the connection pooling and transaction manager of those application servers. When the application runs in Tomcat or Jetty, we'll be using C3P0 for pooling and Atomikos for transactions.
Because of these different subsystems, should the DAOs be tested in a fully configured application server environment, or should we handle those concerns when integration testing the service layer? Currently we plan on setting up a simple JDBC data source with non-JTA (i.e. resource-local) transactions for DAO integration testing, so no application server is involved... but this leaves me wondering about environmental problems we won't uncover.

Besides testing each module with unit tests, integration tests should exercise groups of modules together.
I don't want to be pedantic, but in theory this is followed by system testing, i.e. black-box testing by QA.
For smaller projects this may not be feasible.

I think you're on the right track with this line of thinking. If possible, you should set up a continuous integration server (e.g. Hudson) that mirrors your production environment. That way you can develop with pretty high confidence using Tomcat etc., running tests against your local setup, and when you check in your code be sure that those same tests are being run against the real deal.

Related

how to prevent jdbc from trying to connect to a mysql database during unit testing

I'm making an application for a school project, but I'm running into the issue that when I try to run the unit tests, the application tries to connect to the database while starting up. The connection isn't required for the tests (because the database will be mocked) and isn't available in the CI/CD pipeline.
jdbc connection error
I'm building my project with Java, Maven, and Spring Boot, and would like to know how I can prevent it from trying to connect to the database when running my tests.
here is a link to my repository: https://gitlab.com/kwetter_jack/Kwetter_posts/-/tree/ci_cd_setup
Your test classes have the @SpringBootTest annotation, which will start a full Spring application context - as your application uses a database, the tests will also try to set up and use a database connection.
The simplest solution is to remove the annotation so the tests no longer try to connect to a database. You'll probably need to mock some more dependencies as a result, since Spring is no longer creating them for you. You could also have a look at https://www.baeldung.com/spring-boot-testing for some other ideas on how you could alter your tests.
Alternatively, if you do want / need the application context to run, you can add an application.yaml for the tests that defines and uses an in-memory DB so the tests have something to connect to - see https://www.baeldung.com/spring-boot-h2-database for details on how to do this.
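To illustrate the mocking route in plain Java (all names here - PostRepository, PostService - are hypothetical, not taken from the linked repository), here is a minimal sketch of a design where the database-backed dependency is swapped for an in-memory fake, so no connection is ever attempted:

```java
import java.util.List;

public class FakeRepositoryDemo {
    // Hypothetical repository interface, normally backed by JDBC/JPA
    interface PostRepository {
        List<String> findAll();
    }

    // Service under test depends only on the interface, not on the database
    static class PostService {
        private final PostRepository repo;
        PostService(PostRepository repo) { this.repo = repo; }
        int count() { return repo.findAll().size(); }
    }

    public static void main(String[] args) {
        // In-memory fake: no DataSource, no JDBC driver, no running database
        PostRepository fake = () -> List.of("first post", "second post");
        PostService service = new PostService(fake);
        System.out.println(service.count()); // prints 2
    }
}
```

A mocking library such as Mockito automates exactly this kind of substitution once @SpringBootTest is removed.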
Just change the values under spring.datasource to an H2 database in your test application.yml to prevent the application from connecting to the real database.
FYI, you don't need to copy the whole config from the original application.yml - only the settings you need to override.
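A minimal test application.yml along these lines (assuming the H2 driver is on the test classpath; database name and credentials are placeholders) might look like:

```yaml
# src/test/resources/application.yml - overrides only the datasource settings
spring:
  datasource:
    url: jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1
    driver-class-name: org.h2.Driver
    username: sa
    password: ""
```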
While I was investigating the Spring Boot H2 in-memory database (as suggested by Chris Olive and Paranaaan), I also came across the option of using Testcontainers. After looking into this, I saw that it lets the project create a temporary Docker container with a MySQL image that I can use during testing. Since I was planning on using Docker anyway for the integration testing of my microservices project, I tried this and it worked as I had hoped it would.
If anyone is interested in the Testcontainers solution that I used, the information can be found here:
https://www.testcontainers.org/modules/databases/mysql/
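For Spring tests, the Testcontainers JDBC support can be wired in with configuration alone - a sketch, assuming the testcontainers core and mysql module dependencies are on the test classpath:

```yaml
# src/test/resources/application.yml - the jdbc:tc: prefix tells Testcontainers
# to start a throwaway MySQL 8.0 container for the duration of the tests
spring:
  datasource:
    url: jdbc:tc:mysql:8.0:///testdb
    driver-class-name: org.testcontainers.jdbc.ContainerDatabaseDriver
```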

spring boot vs tomcat

We are currently running an app in production on Tomcat 8.0.44.
We migrated our app to Spring Boot v2.0.4 (embedded Tomcat) and now we need to deploy it to production (replacing the standalone Tomcat deployment).
Could someone share their experience with this kind of migration?
Any impact noticed? All aspects (load, performance, etc.), any downsides?
What should we consider when making this move? (Load testing is problematic due to the nature of our application.)
Also, I understand that Spring Boot can run as a fully executable JAR (currently not implemented on QA). What are the benefits of using it? Is it preferable to launching Spring Boot with java the regular way? Any downsides?
Thanks,
Hila

Running Spring boot locally with a MySQL container as dependency?

After some radical changes to our schema, and after reading some posts on why you should avoid in-memory databases, we have decided to use MySQL locally for testing and development, using a MySQL Docker container with a volume for persistence.
This is fairly straightforward; however, the issues we are having are the following:
The container has to be started separately from the Spring Boot application (a manual docker run task)
The same goes for stopping the container; it's an independent process
My question is essentially: is it possible to have Spring Boot (when using a dev config profile) manage this Docker container?
i.e. I start development work in IntelliJ and run the service; the service checks whether the container is running and, if not, starts it up.
If this idea is bad, then please let me know.
For testing it's not an issue, because we are using a Maven Docker plugin to create the container during the Maven lifecycle.
It's more for devs working locally, to get the service running locally with ease.
Any suggestions welcome!
Bonus points for an IntelliJ setup!
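Short of having Spring Boot manage the container itself, one low-friction sketch (file name, database name, and credentials are hypothetical) is a Compose file that devs start once with docker compose up -d and that persists data across restarts via a named volume:

```yaml
# docker-compose.yml - local MySQL with a named volume for persistence
services:
  mysql:
    image: mysql:8.0
    ports:
      - "3306:3306"
    environment:
      MYSQL_DATABASE: devdb
      MYSQL_ROOT_PASSWORD: devpassword
    volumes:
      - mysql-data:/var/lib/mysql

volumes:
  mysql-data:
```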

Writing Java/Maven integration tests using webservice database and jms

I'm maintaining a Java Enterprise Application that exposes a few webservices and interacts with a few JMS queues and a database.
I've got most of the code unit tested and I thought all was good, until during a refactoring I moved a class to another package, which changed a namespace in my webservice without me noticing and broke all clients.
An integration test would have caught that, so I'm trying to write one.
The application is deployed on JBoss EAP 6.4, how can I make a webservice call in my mvn verify step?
What about JMS Queues? They are configured on the Application Server.
Is the application supposed to be already deployed?
Am I supposed to deploy the application with maven to a JBoss installation before the verify step or start an embedded webserver?
Most of the docs around are confusing to me, and often I see suggestions to mock stuff, which is not integration testing (and which I already do in unit tests).
Someone told me "just use Jenkins" and I read the docs, installed it and still don't understand how that is supposed to help me write integration tests since all it does is run mvn verify anyway.
This topic is too broad - there might be many different correct answers to this question, and a lot will depend on the technologies you're using, so I'll focus first on this part only:
that changed a namespace in my webservice without me noticing and
breaking all clients
You can create tests for endpoints too; I do that all the time with spring-boot. For example, the code below starts the application, runs the testEndpoint() test, and shuts down the application right after.
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.test.IntegrationTest;
import org.springframework.boot.test.SpringApplicationConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.test.context.web.WebAppConfiguration;

@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = {MyApplication.class})
@WebAppConfiguration
@IntegrationTest("server.port:0")
public class MyControllerTest {

    @Value("${local.server.port}")
    private int port;

    @Test
    public void testEndpoint() {
        String endpointUrl = "http://localhost:" + port;
        // make an HTTP request here to test the endpoint
    }
}
I believe this can be done with any spring-mvc application, but the code would be a bit different and not as easy as with spring-boot.
Although this would usually catch most bugs in endpoints, it doesn't eliminate the need for integration tests against a deployed project.
So focusing on the bigger picture now, if you want to create end-to-end tests, including WebServices, JMS queues and databases, I suggest creating a separate project with tests only, probably using Cucumber or something similar. This project should be triggered in Jenkins (or any other CI tool) whenever needed (e.g. before a deployment, every hour, every commit and/or every night) and it will require that all applications are already deployed.
Alternatively, you could have Jenkins deploy an application and run integration tests only against this one application. In this case the tests will depend on all other applications to be already deployed.
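As a concrete sketch of wiring this into mvn verify (plugin versions and the container configuration are assumptions - adapt them to your JBoss EAP setup), the Failsafe plugin runs *IT test classes during the integration-test phase, and a container plugin such as Cargo can start the server and deploy the WAR before them:

```xml
<!-- pom.xml (sketch): run *IT.java tests during "mvn verify" against a deployed WAR -->
<build>
  <plugins>
    <!-- Failsafe binds to the integration-test and verify phases -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-failsafe-plugin</artifactId>
      <executions>
        <execution>
          <goals>
            <goal>integration-test</goal>
            <goal>verify</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
    <!-- Cargo starts a container and deploys the WAR before the IT phase,
         then stops it afterwards -->
    <plugin>
      <groupId>org.codehaus.cargo</groupId>
      <artifactId>cargo-maven2-plugin</artifactId>
      <executions>
        <execution>
          <id>start-container</id>
          <phase>pre-integration-test</phase>
          <goals><goal>start</goal></goals>
        </execution>
        <execution>
          <id>stop-container</id>
          <phase>post-integration-test</phase>
          <goals><goal>stop</goal></goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```

With this wiring, a plain JAX-WS or HTTP client call inside an *IT class exercises the really deployed webservice, which would have caught the namespace change.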

Best approach to integration test a Flex/Java web application via Maven?

I am working on a project that is developing a webapp with a 100% Flex UI that talks via Blaze to a Java backend running on an application server. The team has already created many unit tests, but have only created integration tests for the persistence module. Now we are wondering the best way to integration test the other parts. Here are the Maven modules we have now, I believe this is a very typical design:
Server Side:
1) a Java domain module -- this only has unit tests
2) a Java persistence module (DAO) -- right now this only has integration tests that talk to a live database to test the DAOs, nothing really to unit test here
3) a Java service module -- right now this only has unit tests
Client Side:
4) a Flex services module that is packaged as a SWC and talks to the Java backend -- currently this has no tests at all
5) a Flex client module that implements the Flex UI on top of the Flex services module - this has only unit tests currently (we used MATE to create a loosely couple client with no logic in the views).
These 5 modules are packaged up into a WAR that can be deployed in an application server or servlet container.
Here are the 4 questions I have:
Should we add integration tests to the service module, or is this redundant given that the persistence module has integration tests and the service module already has unit tests? It also seems that integration testing the Flex services module is a higher priority and would exercise the service module at the same time.
We like the idea of keeping the integration tests within their modules, but there is a circularity with the Flex services module and the WAR module. Integration tests for the Flex services module cannot run without an app server, and therefore those tests will have to come AFTER the WAR is built, yes?
What is a good technology to integration test the Flex client UIs (e.g. something like Selenium, but for Flex)?
Should we put final integration tests in the WAR module or create a separate integration testing module that gets built after the WAR?
Any help/opinions is greatly appreciated!
More a hint than a strong answer, but maybe have a look at fluint (formerly dpUInt) and the "Continuous Integration with Maven, Flex, fluint, and Hudson" blog post.
First off, just some clarification. When you say "4) Flex services module packaged as a SWC", you mean a Flex services library that I gather is loaded as an RSL. That's an important difference from writing the services as a runtime module, because the latter could (and typically would) instantiate the services controller itself and distribute the service connection to other modules. Your alternative, simply a library you build into each module, means they all create their own instance of a service controller. You're better off putting the services logic into a module that the application can load prior to the other modules, and that manages the movement of services between them.
Eg.
Application.swf - starts, initialises IoC container, loads Services.swf, injects any dependencies it requires
Services.swf loads, establishes connection to server, manages required service collection
Application.swf adds managed instances from Services.swf into its container (using some form of contextual awareness so as to prevent conflicts)
Application.swf loads ModuleA.swf, injects any dependencies it requires
ModuleA.swf loads, (has dependencies listed that come from Services.swf injected), uses those dependencies to contact services it requires.
That said, sticking with your current structure, I will answer your questions as accurately as possible.
What do you want to test in integration? That your services are there and returning what you expect, I gather. As such, if using Remote Objects in BlazeDS, you could write tests to ensure you can find the endpoint, that the channels can be found, that the destination(s) exist, and that all remote methods return as expected. The server team are testing the data store (from them to the DB and back), but you are testing that the contract between your client and the server still holds. This contract covers any assumptions - such as Value Objects returned on payloads, remote methods existing, etc.
(See #4 below.) The tests should be within their module; however, I would say here that you really should have a module to do the services (instead of a library, as I suggested above). Regardless, yes, still deploy the testing artifacts to a local web server (using Jetty or some such) and ensure the integration-test goal depends on the WAR packager you use.
I find some developers interchange UI/functional testing with integration testing. Whilst you can indeed perform the two together, there is still room for automated integration tests in Flex, where a webserver is loaded up and core services are checked to ensure they exist and return what is required. For the UI/functional tests, Adobe maintain a good collection of resources: http://www.adobe.com/products/flex/related/#ftesting.
For integration tests, as I mentioned, they should have their own goal that depends on the packaged WAR project.
