Our development databases (Oracle 9i) use a database link to a shared remote database.
This decision was made years ago when it wasn't practical to put some of the database schemas on a development machine - they were too big.
We have certain schemas on the development machines and we make the remote schemas look local by using Oracle's database links, together with some synonyms on the development machines.
The problem I have is that I would like to test a piece of SQL which joins tables in schemas on either side of the database link.
e.g. (a simplified case):
select a.col, b.col
from a, b
where a.b_id = b.id
a is on the local database
b is on the remote database
I have a synonym on the local DB so that 'b' actually points at b@remotedb.
Running the query takes ages in the development environment because of the link. The queries run fine in production (I don't think the Oracle cost-based optimiser copes very well with database links).
We have not been very good at writing unit tests for these types of queries in the past - probably due to the poor performance - so I'd like to start creating some tests for them.
Does anyone have any strategies for writing a unit test for such a query, so as to avoid the performance problems of using the database link?
I'd normally be looking at ways of mocking out the remote service, but since all this is in a SQL query, I can't see any way of easily mocking out the remote database.
You should create exact copies of all the schemas you need from production on development, but without all the data. Populate them with just enough data to run a proper test. You can also make the optimizer on the test system behave like production by exporting the statistics from the production server and importing them into the development database for the schemas you are duplicating. That way the query will run against the data set you've made, but it will be optimized with plans similar to those in production. You can then estimate, at least theoretically, how it will scale on production.
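As a sketch of that statistics transfer (the APP schema, the MYSTATS staging table, and the connection details are placeholders, not from the original setup), Oracle's DBMS_STATS package can be driven from JDBC:

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;

public class CopyStatsFromProd {
    public static void main(String[] args) throws Exception {
        // On production: stage the schema statistics into a regular table.
        try (Connection prod = DriverManager.getConnection(
                "jdbc:oracle:thin:@prodhost:1521:PROD", "app", "secret");
             CallableStatement cs = prod.prepareCall(
                "begin "
                + "  dbms_stats.create_stat_table(ownname => 'APP', stattab => 'MYSTATS'); "
                + "  dbms_stats.export_schema_stats(ownname => 'APP', stattab => 'MYSTATS'); "
                + "end;")) {
            cs.execute();
        }
        // Move the MYSTATS table to development (exp/imp, or via the existing DB link).
        // On development: load the staged statistics into the data dictionary.
        try (Connection dev = DriverManager.getConnection(
                "jdbc:oracle:thin:@devhost:1521:DEV", "app", "secret");
             CallableStatement cs = dev.prepareCall(
                "begin dbms_stats.import_schema_stats(ownname => 'APP', stattab => 'MYSTATS'); end;")) {
            cs.execute();
        }
    }
}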
Copy the relevant data into your development database and create the tables locally.
Ideally, just build a test case which tells you:
The SQL is correct (it parses)
It operates correctly with a few rows of test data
Don't fall for the "let's copy everything" because that means you'll have no idea what you're testing anymore (and what you're missing).
If in doubt, create a table b with just a single record. If you get an error in this area, add more rows as you learn where it can fail.
If you want to take this to the edge, create the test table (with all data) in a unit test. This way, you can document the test data you're using.
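A minimal sketch of that style, using an in-memory H2 database as a stand-in and the table names from the example query above:

import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertTrue;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import org.junit.Test;

public class JoinQueryTest {
    @Test
    public void joinFindsMatchingRows() throws Exception {
        try (Connection c = DriverManager.getConnection("jdbc:h2:mem:test");
             Statement s = c.createStatement()) {
            // The test data is created - and therefore documented - right here.
            s.execute("create table a (col varchar(10), b_id int)");
            s.execute("create table b (col varchar(10), id int)");
            s.execute("insert into a values ('foo', 1)");
            s.execute("insert into b values ('bar', 1)");
            try (ResultSet rs = s.executeQuery(
                    "select a.col, b.col from a, b where a.b_id = b.id")) {
                assertTrue(rs.next());
                assertEquals("foo", rs.getString(1));
                assertEquals("bar", rs.getString(2));
            }
        }
    }
}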
[EDIT] What you need is a test database. Don't run tests against a database which can change. Ideally, the tests should tear down the whole database and recreate it from scratch (tables, indexes, data, everything) as the first step.
In this test database, only keep well defined test data that only changes by defining new tests (and not by someone "just doing something"). If you can, try to run your tests against an in-memory database.
I would suggest materialized views. These are views that store remote data locally.
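For example (a sketch - the link name remotedb matches the question, everything else is illustrative), the synonym for b could be replaced by a materialized view that Oracle refreshes overnight:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CreateLocalCopyOfB {
    public static void main(String[] args) throws Exception {
        try (Connection c = DriverManager.getConnection(
                "jdbc:oracle:thin:@devhost:1521:DEV", "app", "secret");
             Statement s = c.createStatement()) {
            // Drop the synonym b first so the name is free, then keep a local,
            // daily-refreshed copy of the remote table.
            s.execute("create materialized view b "
                    + "refresh complete start with sysdate next sysdate + 1 "
                    + "as select * from b@remotedb");
        }
    }
}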
In theory, to do the unit testing you can work with any set of controlled data created and designed around your test cases. It doesn't have to be your live or development system - that's assuming your unit is portable enough. You would test it against your actual databases/application when you come to integration testing, which might as well happen on the live system anyway (so no DB links would be required - I understand your live databases are all in one place).
What I'm trying to say is that you can/should test your unit (i.e. your component, query, or whatever you define as a unit) on a controlled set of data that simulates different use cases, and once you complete that testing with satisfactory results, you can proceed to integration and to running integration tests.
Integration tests - you could run these in the live environment, but only after you've proved by unit testing that your component is 'bullet-proof' (if that's OK with your company's approach/philosophy :) - sys admin's reaction: "Are you flippin' crazy?!").
If you are trying to go back and test units that are already implemented, then why bother? If they've been in production use for some time without any incidents, I would argue that they're OK. However, there's always a chance that your unit/query has some slowly ticking time-bomb side effect (a cumulative effect over time). In that case, analysing the impact is the answer.
I use Play 2.3 with Hibernate.
On starting up the application the first time, I want to have some data inserted into the database as default values.
In my case I have an entity class "Studycourse". All tables are created through JPA on first run.
I use DB evolution (1.sql) to insert the default data, e.g.:
INSERT INTO studycourse (id, title) VALUES (1, 'Computer Science');
This works when using the normal "activator run" command. But if I do "activator test" and start a simple integration test with inMemoryDatabase(), I get the following error:
[error] play - Table "STUDYCOURSE" not found; SQL statement: INSERT INTO studycourse (id, title) VALUES (1, 'Computer Science')
I guess that the initial JPA setup is not done in the in-memory DB.
Question: Is there a best practice on how to do this?
The integration test looks like:
import org.junit.Test;
import play.libs.F.Callback;
import play.test.TestBrowser;
import static org.fest.assertions.Assertions.assertThat;
import static play.test.Helpers.*;

public class IntegrationTest {
    @Test
    public void test() {
        running(testServer(3333, fakeApplication(inMemoryDatabase())), HTMLUNIT,
                new Callback<TestBrowser>() {
            public void invoke(TestBrowser browser) {
                browser.goTo("http://localhost:3333");
                assertThat(browser.pageSource()).contains("Your new application is ready.");
            }
        });
    }
}
Thanks in advance.
Your original question was essentially asking "How can I execute my JPA initialization steps in a test environment so that my in-memory database is populated when I run integration tests involving my database?".
My answer will not directly address that but it will summarize how we solve the same underlying issues you're trying to solve.
My interpretation of your objectives is that you want to:
Establish a good practice for database schema migration
Establish a common practice for database integration testing
Database Schema Migration
As I mentioned in my comment above, we use http://flywaydb.org for database schema migrations and it has been an outstanding tool. Flyway has an SBT plugin so you can run flywayClean and flywayMigrate right from activator to delete and re-initialize your database instantly.
Flyway supports sophisticated versioned file names, so you can execute SQL scripts like V1_1_0__init.sql, V1_1_1__add_index.sql, and V1_2_0__new_tables.sql. Flyway will also complain if you try to execute migration scripts that are not a pure improvement on the existing state of the database. This means we use Flyway to push database schema migrations to production, resting confident that it will fail if we've done something silly. Of course, we always take a DB backup right before the migration, just to be safe.
Finally, Flyway will even let you execute java programs to populate the database in case you want to use service methods instead of just raw SQL.
Anyway, your choices here are basically Play evolutions, Flyway, or Liquibase.
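For example, a minimal sketch of the same clean/migrate cycle driven from plain Java with Flyway's classic API (the data source details are placeholders):

import org.flywaydb.core.Flyway;

public class MigrateDb {
    public static void main(String[] args) {
        Flyway flyway = new Flyway();
        // Placeholder - point this at your dev database.
        flyway.setDataSource("jdbc:h2:mem:dev;DB_CLOSE_DELAY=-1", "sa", "");
        flyway.clean();   // drop everything, like flywayClean
        flyway.migrate(); // re-apply the V*__*.sql scripts in order, like flywayMigrate
    }
}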
In-Memory Database vs. Dev-Database
On this issue, I've seen two primary positions:
1. Never test on an in-memory database, because then your tests won't reveal the subtle differences between your in-memory database and your production database, or
2. Use an in-memory database for local testing, but at least have your build server use a dev database.
You can see, for example, the comments at the end of http://blog.jooq.org/2014/06/26/stop-unit-testing-database-code/.
Option #1 gives you slightly lower speed overall, but immediate feedback against the real database.
Option #2 gives you higher speed overall, but delays the feedback time between writing bad code and getting a failing integration test.
As with most things in engineering, there is no "best" solution, just a set of tradeoffs which make the most sense for your team.
Choosing an ORM Layer
We initially began with Hibernate but eventually switched to http://jooq.org. See http://www.vertabelo.com/blog/technical-articles/jooq-a-really-nice-alternative-to-hibernate for a jOOQ-positive overview, and http://blog.jooq.org/2012/04/21/jooq-and-hibernate-a-discussion/ for a good discussion on the two.
Hibernate seemed attractive to us because it was so mature and so popular, but when we began running into classic SQL vs. Object-Oriented impedance mismatches like how to handle inheritance, Hibernate required a learning curve and some setup overhead.
We reasoned that, if we're going to incur that overhead at all, why not just use SQL directly to do the mappings? So we switched to jOOQ and have been able to write some very clean, elegant, and testable code. If you're not too far down the Hibernate path, I would encourage you to take a look at jOOQ.
If you're already deep into Hibernate, and it's working well for you, there's probably little value to switching.
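For a flavor of the style, here is a sketch using jOOQ's string-based fields against a scratch H2 database (the table is illustrative, not from our schema):

import static org.jooq.impl.DSL.field;
import static org.jooq.impl.DSL.table;
import static org.jooq.impl.DSL.using;

import java.sql.Connection;
import java.sql.DriverManager;
import org.jooq.DSLContext;
import org.jooq.Result;
import org.jooq.SQLDialect;

public class JooqTaste {
    public static void main(String[] args) throws Exception {
        try (Connection c = DriverManager.getConnection("jdbc:h2:mem:demo")) {
            DSLContext create = using(c, SQLDialect.H2);
            create.execute("create table studycourse (id int, title varchar(100))");
            create.execute("insert into studycourse values (1, 'Computer Science')");
            // SQL built in plain Java - no sessions, no lazy loading, easy to test.
            Result<?> result = create
                    .select(field("id"), field("title"))
                    .from(table("studycourse"))
                    .where(field("id").eq(1))
                    .fetch();
            System.out.println(result);
        }
    }
}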
Best Practices for Database Integration Testing
I wondered about this exact question and posted about it at https://groups.google.com/forum/#!topic/jooq-user/GkBW5ZGdgwQ. Lukas, the author of jOOQ, responded with some general remarks.
At this point, we integration test most of our DAOs and service classes. Our tests run after flywayClean and flywayMigrate have been run, and each test is written to clean up after itself. The biggest issue is performance, which so far is not a problem but may become one later.
I also posted about this on the play-framework group and received a helpful answer. See https://groups.google.com/d/msg/play-framework/BgOCIgz_9q0/jBy8zxejPEkJ.
Disclaimer: we are close to launching our app but not yet running it in production, so others may have additional best practices to add.
This question is extracted from a comment I posted here:
What's the best strategy for unit-testing database-driven applications?
So I have a huge database schema for a legacy application (with quite an old code base) that has many tables, synonyms, triggers, and dblinks. We have (finally) started to test some parts of the application.
Our tests already use mocks, but in order to test the queries we actually run, we have decided to use an in-memory DB with a short-lived test dataset.
But the setup of the in-memory database requires a specific SQL script for the schema, and that script is not the real DDL we have in production, because we cannot import the production DDL directly.
To make things harder, the database contains functions and procedures that need to be implemented in Java (we use the H2 db, and that is the way to declare procedures there).
I'm afraid that our tests won't break the day the real DB changes, and that we will spot the problem only at runtime, potentially in production.
I know that our tests sit somewhere on the border between integration and unit tests. However, with the current architecture it is quite hard to insulate the tests from the DB, and we want proper tests for the DB queries (no ORM involved).
What would be a solution for getting a DDL as close as possible to the real one, without the need to maintain it manually?
If your environments are Dockerized I would highly suggest checking out Testcontainers (https://www.testcontainers.org/modules/databases/). We have used it to replace in-memory databases in our tests with database instances created from production DDL scripts.
Additionally, you can use tmpfs mounting to get performance levels similar to in-memory databases. This is nicely explained in following post from Vlad Mihalcea: https://vladmihalcea.com/how-to-run-integration-tests-at-warp-speed-with-docker-and-tmpfs/.
This combination works great for our purposes (especially when combined with Hibernate's auto-ddl option), and I recommend that you check it out.
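A minimal JUnit 4 sketch of that setup (shown with the PostgreSQL module for brevity - there is an Oracle module as well; the image tag and script name are placeholders):

import java.sql.Connection;
import java.sql.DriverManager;
import org.junit.ClassRule;
import org.junit.Test;
import org.testcontainers.containers.PostgreSQLContainer;

public class RepositoryIT {
    // One throwaway database container for the whole test class; the init
    // script is your real production DDL rather than a hand-maintained copy.
    @ClassRule
    public static PostgreSQLContainer<?> db =
            new PostgreSQLContainer<>("postgres:13")
                    .withInitScript("production-ddl.sql"); // on the test classpath

    @Test
    public void queryRunsAgainstTheRealSchema() throws Exception {
        try (Connection c = DriverManager.getConnection(
                db.getJdbcUrl(), db.getUsername(), db.getPassword())) {
            // ... run the queries under test ...
        }
    }
}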
I am developing an application that tests different web services, and I want it to be as generic as possible. I need to populate the database for JUnit tests, but I don't want these changes to be committed.
I know that some in-memory databases like HSQLDB allow testing against a kind of virtual (or mock) database, but unfortunately I use Oracle, and I cannot change that now because of my complex table structures.
What is the best practice you suggest?
Thanks.
First of all, HSQL and Hibernate aren't related in any way. The question is whether you can find an embedded database which supports the same SQL as your production database (or rather the subset of SQL which your application uses).
A good candidate for this is H2 database since it emulates a lot of different SQL flavours.
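A sketch of that with H2's Oracle compatibility mode (it emulates a lot, not everything, so treat it as an approximation):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class OracleFlavouredH2 {
    public static void main(String[] args) throws Exception {
        // In-memory H2 pretending to be Oracle; nothing is ever committed to disk.
        try (Connection c = DriverManager.getConnection(
                "jdbc:h2:mem:testdb;MODE=Oracle;DB_CLOSE_DELAY=-1", "sa", "");
             Statement s = c.createStatement()) {
            s.execute("create table t (id int primary key, name varchar(100))");
            s.execute("insert into t values (1, 'x')");
        }
    }
}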
On top of that: Don't test the database. Assume that the database is tested thoroughly by your vendor and just works.
In my code, I aim for:
Save and load each entity.
Generate the SQL for all the queries that I use and compare them against String literals in tests (i.e. I don't run the queries against the database all the time).
Some tests look for a System property. If it's set, then they will run the queries against the database. This happens during the night on my CI server.
The rationale for this: as long as the DB schema doesn't change, there is no point in actually running the queries. That means running them during the day, while I sit in front of the computer, is a huge waste of time.
To make sure that "low impact" changes don't slip through the gaps, I let a computer run them when I don't care.
Along the same lines, I have mocks for many DAOs which return various predefined results, so I don't have to query the database. The rationale here is that I want to test the processing of results from the database, not the JDBC API, the DB driver, the OS's TCP/IP stack, the network hardware (and software), or any other of the 1000 things between my code and the database records on a harddisk somewhere.
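For instance, a sketch with Mockito (the DAO interface and service are hypothetical stand-ins):

import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import java.util.Arrays;
import java.util.List;
import org.junit.Test;

public class ReportServiceTest {
    // Hypothetical types, for illustration only.
    interface UserDao { List<String> findActiveUserNames(); }

    static class ReportService {
        private final UserDao dao;
        ReportService(UserDao dao) { this.dao = dao; }
        String headline() { return dao.findActiveUserNames().size() + " active users"; }
    }

    @Test
    public void buildsHeadlineFromDaoResults() {
        UserDao dao = mock(UserDao.class);
        when(dao.findActiveUserNames()).thenReturn(Arrays.asList("alice", "bob"));
        // Exercises the processing of results - not JDBC, the driver or the network.
        assertEquals("2 active users", new ReportService(dao).headline());
    }
}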
More details in my blog: http://blog.pdark.de/2008/07/26/testing-with-databases/
Here is my scenario:
I have a Java application that reads data from a table T1 of database D1, processes it, and puts it in another table T2 of another database D2. This happens in real time, i.e., as and when a record is inserted or updated in table T1, the application picks up the data, processes it, and pushes it to the destination table. I wish to monitor the performance of this application using a testing (preferably JUnit) and/or performance framework. In my test case I wish to have the following:
Insert and update a fixed number of records for fixed time at fixed intervals on table T1 of database D1.
After a fixed time, either check the number of records that are present in T2 of database D2 or look for existence of a specific record.
The tests that I wish to create should be
Database agnostic
Provide results that can show trends and be configurable with a CI tool like Jenkins
So, my question is, what is the best way to test this kind of scenario? Are there any available tools that will help me achieve this?
Database agnostic
In order to achieve that, I would suggest using the simplest possible SQL plus some low-level JDBC abstraction layer:
DBUtils
The Commons DbUtils library is a small set of classes designed to make working with JDBC easier. JDBC resource cleanup code is mundane, error prone work so these classes abstract out all of the cleanup tasks from your code leaving you with what you really wanted to do with JDBC in the first place: query and update data.
MyBatis
MyBatis is a first class persistence framework with support for custom SQL, stored procedures and advanced mappings. MyBatis eliminates almost all of the JDBC code and manual setting of parameters and retrieval of results. MyBatis can use simple XML or Annotations for configuration and map primitives, Map interfaces and Java POJOs (Plain Old Java Objects) to database records.
Both will do the trick for you. With good attention to detail you'll manage to provide a flexible enough solution and test as many databases as you want.
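A sketch of the DbUtils style (the table name and DataSource wiring are placeholders):

import java.sql.SQLException;
import javax.sql.DataSource;
import org.apache.commons.dbutils.QueryRunner;
import org.apache.commons.dbutils.handlers.ScalarHandler;

public class T2Checker {
    private final QueryRunner runner;

    public T2Checker(DataSource ds) {
        this.runner = new QueryRunner(ds);
    }

    // Counts the rows that have arrived in T2; plain SQL and DbUtils' resource
    // handling keep this database agnostic.
    public long countProcessedRecords() throws SQLException {
        Number count = runner.query("select count(*) from t2", new ScalarHandler<Number>());
        return count.longValue();
    }
}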
Provide results that can show trends and be configurable with a CI tool like Jenkins
Define several KPIs and make sure you can collect their values periodically. For example, you can measure throughput (records per second). Export the data periodically (as CSV or properties, for example) and use the PlotPlugin for visualization.
You can also check relevant question: How do I plot benchmark data in a Jenkins matrix project
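As a sketch of the export step mentioned above (the file name and CSV layout are just one option the PlotPlugin accepts):

import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

public class ThroughputReporter {
    // Writes the current build's sample; the PlotPlugin charts one point per
    // build, which gives you the trend over time.
    public static void report(long recordsProcessed, long elapsedMillis) throws IOException {
        double recordsPerSecond = recordsProcessed / (elapsedMillis / 1000.0);
        try (PrintWriter out = new PrintWriter(new FileWriter("throughput.csv"))) {
            out.println("recordsPerSecond");
            out.println(recordsPerSecond);
        }
    }
}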
Proper testing
Please make sure your testing strategy is well defined and you will not miss anything:
Load testing
Stress testing
Since I'm not really proficient with databases, some details may be irrelevant, but I'll include everything:
As part of a project in my University, we're creating a website that uses JSP, servlets and uses a MySQL server as backend.
I'm in charge of setting up the tables on the DB, and creating the Java classes to interact with it. However, we can only connect to the MySQL server from inside the University, while we all (7 people) work mostly at home.
I'm creating an interface QueryHandler which has a method that takes a string (representing a query) and returns a ResultSet. My question is this: how do I create a class implementing this interface that simulates a database, so that others can use different DBHandlers without knowing the difference, and so that I can test different queries without connecting to the actual MySQL database?
EDIT: I'm not so sure about the differences between SQL databases, but obviously all the queries I run on MySQL should also run on the mock.
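One sketch of such a class, backing the interface with an in-memory H2 database in MySQL compatibility mode (the interface shape is my guess at what is described above):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;

// The interface as described in the question (assumed shape).
interface QueryHandler {
    ResultSet executeQuery(String query) throws SQLException;
}

// A stand-in "database" the team can use away from the university network.
class InMemoryQueryHandler implements QueryHandler {
    private final Connection connection;

    InMemoryQueryHandler() throws SQLException {
        // MODE=MySQL makes H2 accept most MySQL-flavoured SQL.
        connection = DriverManager.getConnection(
                "jdbc:h2:mem:fake;MODE=MySQL;DB_CLOSE_DELAY=-1", "sa", "");
    }

    @Override
    public ResultSet executeQuery(String query) throws SQLException {
        return connection.createStatement().executeQuery(query);
    }
}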
Why not just install your own MySQL database for testing? It runs on Windows, Mac and Linux, and it's not too resource heavy. I have it installed on my laptop for local testing.
Your API appears to be flawed. You should not be returning ResultSets to clients. By doing so, you are forever forcing your clients to rely on a relational database backend. Your data access layer needs to hide all of the details of how your data is actually structured and stored.
Instead of returning a ResultSet, consider returning a List or allowing the client to supply a Stream that your data access component can write to.
This will make unit tests trivial for the clients of the API and will allow you to swap storage mechanisms at will.
Try Derby. It's a free database you can test against, if you don't mind having to change drivers when you go back to MySQL. You might be limited in the kinds of queries you can run, though. I'm not sure whether MySQL has any special syntax outside of standard SQL.
How about using HSQLDB for offline tests? It won't behave exactly like a MySQL DB, but it is a fast in-memory SQL DB that should satisfy most of your needs.
The best way in my experience is multiple database instances and/or schemas. Normally you'd have one for each user to develop against and sanity-check the running application, one for an automated build running unit tests, and ideally one for each user to run their own unit tests against - and of course instances/schemas for demos and integration testing. Apart from the practical side, being able to do this ensures that deploying/upgrading the app/database will be pretty near faultless too.
Assuming you have a DAO layer, the only code that needs access to a real database at the unit-test level is the DAO implementation; the business layer should use a mock DAO implementation.
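A sketch of that layering (all names hypothetical): because the business layer depends only on the DAO interface, a hand-rolled stub replaces the database entirely in unit tests:

import java.util.Arrays;
import java.util.List;

// Only the JDBC-backed implementation of this interface ever touches a real database.
interface StudentDao {
    List<String> findEnrolledStudents(String courseId);
}

class EnrollmentService {
    private final StudentDao dao;
    EnrollmentService(StudentDao dao) { this.dao = dao; }
    int enrollmentCount(String courseId) { return dao.findEnrolledStudents(courseId).size(); }
}

class EnrollmentServiceDemo {
    public static void main(String[] args) {
        // Stub DAO: fixed data, no database connection required.
        StudentDao stub = courseId -> Arrays.asList("alice", "bob");
        System.out.println(new EnrollmentService(stub).enrollmentCount("CS101")); // prints 2
    }
}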