I would like to find a reliable way to test my MariaDB schema with jOOQ.
This is what I have now:
let jOOQ (run from Gradle) extract an XML schema from the real DB (i.e. the MariaDB instance) via org.jooq.codegen.XMLGenerator; this step generates a maria_information_schema.xml;
use this schema to generate all the Java classes.
To test all the classes I will write, I have developed a technique:
fire up an embedded in-memory H2 database;
build a Java class, InitDatabase.java, which manually sets up a DB schema as similar as possible to the MariaDB one;
perform all the tests on the in-memory DB (see the sketch below).
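For context, the in-memory H2 is opened roughly like this (a simplified sketch; the MODE flag and the InitDatabase call are illustrative, not exact):

import java.sql.Connection;
import java.sql.DriverManager;

// In-memory H2; MODE=MariaDB asks H2 to emulate MariaDB syntax where it can,
// and DB_CLOSE_DELAY=-1 keeps the DB alive for the whole JVM.
Connection conn = DriverManager.getConnection(
        "jdbc:h2:mem:testdb;MODE=MariaDB;DB_CLOSE_DELAY=-1", "sa", "");
new InitDatabase().createSchema(conn); // hand-maintained CREATE TABLE statements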
This procedure works perfectly until I change something on the real DB and forget to make the same change in the InitDatabase.java class.
My question is: is there a way to use the XML schema or the generated Java classes to create an H2 database with the same schema as the MariaDB one, without manually writing all the CREATE TABLE statements?
Thanks
This is an open-ended question with no obvious, "correct" answer. jOOQ's official take here is that you may want to reconsider using an alternative RDBMS for testing when you could be testing against your target RDBMS directly, specifically by using Testcontainers.
You could combine this approach with jOOQ's code generation for a more streamlined development process.
In short, jOOQ's recommendation is to use:
Flyway or Liquibase for database change management
Testcontainers for code generation
Testcontainers for integration testing
Other, equivalent products are obviously also possible.
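For illustration, the integration-testing piece might look like this with JUnit 5 and the Testcontainers MariaDB module (a sketch; the image tag and test names are arbitrary):

import java.sql.Connection;
import java.sql.DriverManager;
import org.junit.jupiter.api.AfterAll;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Test;
import org.testcontainers.containers.MariaDBContainer;

class SchemaIT {
    // One disposable MariaDB instance, shared by the whole test class.
    static final MariaDBContainer<?> MARIA = new MariaDBContainer<>("mariadb:10.6");

    @BeforeAll
    static void start() { MARIA.start(); }

    @AfterAll
    static void stop() { MARIA.stop(); }

    @Test
    void connectionIsValid() throws Exception {
        try (Connection c = DriverManager.getConnection(
                MARIA.getJdbcUrl(), MARIA.getUsername(), MARIA.getPassword())) {
            Assertions.assertTrue(c.isValid(1));
        }
    }
}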
Related
I am testing an API that creates databases/tables in Postgres. For automated testing, I was thinking along the lines of having a setup method that creates a database with the tables set up and populated with the required data (1000 entries/rows).
Is there an elegant way of doing this? Any thoughts, apart from writing code that loops 1000 times and writes data stored in a CSV to the Postgres table?
Honestly CSV, XML, or any other structured format seems fine to me. Is there a reason you don't want to do that? Using the pg_dump command to export data from an existing DB and using pg_restore could be a good option too.
Your other idea about writing code to generate the data isn't bad either. The benefit of writing code is that your test isn't coupled to a data file.
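If you go the code route, a plain JDBC batch insert keeps the 1000-row load fast (a sketch; the table and columns are made up):

import java.sql.Connection;
import java.sql.PreparedStatement;

// Hypothetical table: test_data(id INT, name VARCHAR); conn is an open connection.
try (PreparedStatement ps = conn.prepareStatement(
        "INSERT INTO test_data (id, name) VALUES (?, ?)")) {
    for (int i = 0; i < 1000; i++) {
        ps.setInt(1, i);
        ps.setString(2, "row-" + i);
        ps.addBatch();
    }
    ps.executeBatch(); // sends the rows as one batch instead of 1000 single inserts
}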
Also, I would take a look at the H2 database because it has PostgreSQL compatibility mode, and you can actually embed it in your unit/integration tests instead of relying on a PostgreSQL server to be set up and configured in your tests. We've used H2 to test our PostgreSQL app, and it's worked well. The downside is you can't be 100% sure that just because your test passes against H2 that it will against PostgreSQL.
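For reference, the compatibility mode is selected in the JDBC URL (the database name here is arbitrary):

import java.sql.Connection;
import java.sql.DriverManager;

// Embedded, in-memory H2 running in PostgreSQL compatibility mode.
Connection conn = DriverManager.getConnection(
        "jdbc:h2:mem:pgtest;MODE=PostgreSQL", "sa", "");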
If you really prefer to use PostgreSQL (instead of H2) for testing, you could use Liquibase. It's a database schema management tool that supports (among other things) bulk loading data from CSV (http://www.liquibase.org/documentation/changes/load_data.html).
It also offers Spring integration if your application uses it.
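If you drive Liquibase from Java, the kick-off is small (a sketch using the classic Liquibase API; the connection details and changelog path are placeholders, and the changelog would contain a loadData change set pointing at your CSV):

import java.sql.Connection;
import java.sql.DriverManager;
import liquibase.Liquibase;
import liquibase.database.Database;
import liquibase.database.DatabaseFactory;
import liquibase.database.jvm.JdbcConnection;
import liquibase.resource.ClassLoaderResourceAccessor;

Connection conn = DriverManager.getConnection(
        "jdbc:postgresql://localhost:5432/testdb", "test", "test");
Database db = DatabaseFactory.getInstance()
        .findCorrectDatabaseImplementation(new JdbcConnection(conn));
Liquibase liquibase = new Liquibase(
        "db/changelog.xml", new ClassLoaderResourceAccessor(), db);
liquibase.update(""); // applies the changelog, including the CSV load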
This question is extracted from a comment I posted here:
What's the best strategy for unit-testing database-driven applications?
So I have a huge database schema for a legacy application (with quite an old code base) that has many tables, synonyms, triggers, and dblinks. We have (finally) started to test some parts of the application.
Our tests already use mocks, but in order to test the queries we are using, we have decided to use an in-memory DB with short-lived test datasets.
But the setup of the in-memory database requires a specific SQL script for the schema setup. The script is not the real DDL we have in production, because we cannot import that directly.
To make things harder, the database contains functions and procedures that need to be implemented in Java (we use the H2 DB, and that is the way to declare procedures there).
I'm afraid that our tests won't break the day the real DB changes, and that we will spot the problem only at runtime, potentially in production.
I know that our tests sit somewhere on the border between integration and unit tests. However, with the current architecture it is quite hard to insulate the tests from the DB, and we want proper tests for the DB queries (no ORM involved).
What would be a solution for having a DDL as close as possible to the real one, without the need to maintain it manually?
If your environments are Dockerized, I would highly suggest checking out Testcontainers (https://www.testcontainers.org/modules/databases/). We have used it to replace in-memory databases in our tests with database instances created from the production DDL scripts.
Additionally, you can use tmpfs mounting to get performance levels similar to in-memory databases. This is nicely explained in the following post by Vlad Mihalcea: https://vladmihalcea.com/how-to-run-integration-tests-at-warp-speed-with-docker-and-tmpfs/.
This combination works great for our purposes (especially when combined with Hibernate's auto-DDL option), and I recommend that you check it out.
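The tmpfs trick from the linked post translates to the Testcontainers API roughly like this (a sketch; the image tag and mount path are assumptions):

import java.util.Map;
import org.testcontainers.containers.PostgreSQLContainer;

// Mount the data directory on tmpfs so all database writes stay in RAM.
try (PostgreSQLContainer<?> pg = new PostgreSQLContainer<>("postgres:13")
        .withTmpFs(Map.of("/var/lib/postgresql/data", "rw"))) {
    pg.start();
    // run the tests against pg.getJdbcUrl() here
}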
Is there any way to test with Java/JUnit that SQL scripts contain only standard SQL?
Currently we have SQL scripts for creating a database etc. in a Postgres DB, but when using HSQLDB everything fails. That's why I wonder whether any Java tools exist for testing if SQL statements are standard SQL.
Or would it just be wise to create different sets of scripts per database vendor?
If so, is there a way to test whether a given script works with Postgres/HSQLDB?
The H2 database supports different compatibility modes, which may help you with Postgres testing. I've found that our SQL often contains functions which are not supported by H2, but you can create your own "stored procedures" which actually invoke a static Java method to work around this. If you want to support different database vendors, you should go down the vendor-specific script route, unless you are doing really basic queries.
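The Java-backed function workaround uses H2's CREATE ALIAS, which maps a SQL function name onto a static Java method (a self-contained sketch; the function itself is made up):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class H2AliasDemo {
    // Static method that backs the SQL function.
    public static String shout(String s) { return s.toUpperCase(); }

    public static void main(String[] args) throws Exception {
        try (Connection c = DriverManager.getConnection("jdbc:h2:mem:demo");
             Statement st = c.createStatement()) {
            st.execute("CREATE ALIAS SHOUT FOR \"H2AliasDemo.shout\"");
            try (ResultSet rs = st.executeQuery("SELECT SHOUT('hello')")) {
                rs.next();
                System.out.println(rs.getString(1)); // prints HELLO
            }
        }
    }
}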
If you have the resources available, I would recommend setting up a fully fledged UAT environment which you can use to test against a live Postgres database, as even seemingly minor DB configuration differences can impact query plans in unexpected ways.
I've usually made a very simple Java wrapper that tests this code by using a localhost connection with some standard user/pass settings. Remember to use a temporary database or a known test database so your tests don't destroy anything important.
The reason for the above is that I have had the need for specific databases (non-standard features etc.).
If you only want to test standard SQL stuff in JUnit tests (like syntax, selects etc.), I would consider using an embedded SQL database in Java (usually memory-only). That way it is easy to test lots of things without the need to install a DB, and also without the risk of destroying other installations.
It sounds like you're looking for an SQL syntax parser and validator. The only Java SQL parser with which I'm familiar is Zql, but I've never actually used it.
A similar question was asked early last year, and the best answer there turned out to be writing your own parser with ANTLR.
The best tool for checking SQL statements for conformance to Standard SQL is HSQLDB 2.0. This is especially true with data definition statements. HSQLDB has been written to the Standard, as opposed to adopting bits of the Standard over several versions.
PostgreSQL has been slowly moving towards standard SQL, but it still has some "legacy" data types. HSQLDB allows you to define types with the CREATE TYPE statement for compatibility with other dialects. It also allows you to define functions in SQL or Java for the same purpose.
The best policy is to use standard SQL as much as possible, with user-defined types and functions to support an alternative database's syntax.
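For example, an HSQLDB test setup can paper over a PostgreSQL-specific type like TEXT with a user-defined type (a sketch; the VARCHAR size is arbitrary):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

try (Connection c = DriverManager.getConnection("jdbc:hsqldb:mem:test", "SA", "");
     Statement st = c.createStatement()) {
    // Emulate PostgreSQL's TEXT type so the same DDL runs on both databases.
    st.execute("CREATE TYPE TEXT AS VARCHAR(1000000)");
}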
ZQL works fine with JUnit:
ZqlParser parser = new ZqlParser(inputStream); // any java.io.InputStream containing SQL
try {
    ZStatement statement = parser.readStatement();
} catch (ParseException e) {
    // invalid SQL is caught here
}
Different vendors expose different additional features in their SQL implementations.
You may decide on the set of databases to test against, and then use http://dbunit.sourceforge.net to simplify the testing job.
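A typical DbUnit setup seeds each chosen database to a known state before the tests run (a sketch; the dataset file name is hypothetical):

import java.io.File;
import java.sql.Connection;
import org.dbunit.database.DatabaseConnection;
import org.dbunit.database.IDatabaseConnection;
import org.dbunit.dataset.IDataSet;
import org.dbunit.dataset.xml.FlatXmlDataSetBuilder;
import org.dbunit.operation.DatabaseOperation;

// jdbc is an open java.sql.Connection to the database under test.
IDatabaseConnection conn = new DatabaseConnection(jdbc);
IDataSet dataSet = new FlatXmlDataSetBuilder().build(new File("dataset.xml"));
DatabaseOperation.CLEAN_INSERT.execute(conn, dataSet); // wipe and reload the tables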
I'm working on some database migration code in Java. I'm also using a factory pattern so I can use different kinds of databases, and each kind of database I use implements a common interface.
What I would like to do is have a migration check that is internal to the class and runs some database schema update code automatically. The actual update is pretty straightforward (I check the schema version in a table and compare it against a constant in my app to decide whether to migrate, and between which versions of the schema).
To make this automatic, I was thinking the check should live inside (or be called from) the constructor. OK, fair enough, that's simple enough. My problem is that I don't want the check to run every single time I instantiate a database object (it runs a query, so having it run on every construction is not efficient). So maybe this should be a static class method? I guess my question is: what is a good design pattern for this type of problem? There ought to be a clean way to ensure the migration check runs only once OR is super-efficient.
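One common shape for the "run at most once" requirement is a static guard around the check in a shared base class (a sketch; all names are hypothetical):

import java.util.concurrent.atomic.AtomicBoolean;

public abstract class BaseDatabase {
    // Flipped exactly once per JVM, even under concurrent construction.
    private static final AtomicBoolean MIGRATION_CHECKED = new AtomicBoolean(false);

    protected BaseDatabase() {
        if (MIGRATION_CHECKED.compareAndSet(false, true)) {
            migrateIfNeeded();
        }
    }

    private void migrateIfNeeded() {
        // compare the schema version stored in the DB against the app's
        // expected version constant, and run the update code if they differ
    }
}

This keeps construction cheap after the first instance; the alternative is to take the check out of the constructor entirely and call it once at application startup.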
Have a look at Liquibase.
Here's an IBM developerWorks article with a nice walk-through: http://www.ibm.com/developerworks/java/library/j-ap08058/index.html
Flyway fits your needs perfectly. It supports multiple databases, compares the schema version with the available migrations on the classpath and upgrades the database accordingly.
You can embed it in your application and have it run once on startup as described in the Flyway docs.
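The startup hook is tiny (a sketch using the current Flyway API; the data source settings are placeholders):

import org.flywaydb.core.Flyway;

// Run once on application startup, before anything touches the schema.
Flyway flyway = Flyway.configure()
        .dataSource(url, user, password)
        .load();
flyway.migrate(); // applies any pending migrations found on the classpath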
Note: Flyway also comes with a Maven plugin and the ability to clean an existing schema in case you messed things up in development.
[Disclaimer: I'm one of Flyway's developers]
I've been using the iBatis SQL Mapper and really like it. The next version, iBatis 3.0, has schema migration support. This is still in beta, but I'm planning on using it when it gets closer to a release candidate.
I'm looking for a general solution for upgrading database schemas with ORM tools, like JPOX or Hibernate. How do you do it in your projects?
The first solution that comes to my mind is to create my own mechanism for upgrading databases, with SQL scripts doing all the work. But in this case I'll have to remember to create new scripts every time the object mappings are updated. And I'll still have to deal with low-level SQL queries, instead of just defining mappings and letting the ORM tools do all the work...
So the question is how to do this properly. Maybe some tools allow for simplifying this task (for example, I heard that Rails has such a mechanism built in); if so, please help me decide which ORM tool to choose for my next Java project.
LiquiBase is an interesting open source library for handling database refactorings (upgrades). I have not used it, but will definitely give it a try on my next project where I need to upgrade a db schema.
I don't see why ORM-generated schemas are any different from other DB schemas - the problem is the same. Assuming your ORM will spit out a generation script, you can use an external tool to do the diff.
I've not tried it, but Google came back with SQLCompare as one option - I'm sure there are others.
We hand-code SQL update scripts, and we tear down the schema and rebuild it, applying the update scripts as part of our continuous build process. If any Hibernate mappings do not match the schema, the build will fail.
You can check this feature comparison of some database schema upgrade tools.
A comparison of the number of questions tagged on Stack Overflow for some of those tools:
mybatis (1049 questions tagged)
Liquibase (663 questions tagged)
Flyway (400 questions tagged)
DBDeploy (24 questions tagged).
DbMaintain can also help here.
I think your best bet is to use an ORM tool that includes database migration, like SubSonic:
http://subsonicproject.com/2-1-pakala/subsonic-using-migrations/
We ended up making update scripts each time we changed the database. So there's a script from version 10 to 11, from 11 to 12, etc. Then we can run any consecutive set of scripts to get from an existing version to the new version. We stored the current version in the database so we could detect this upon startup.
Yes, this involved database-specific code - one of the main problems with Hibernate!
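In code, the startup check was along these lines (a simplified sketch; the version table and script naming are illustrative):

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;

public class SchemaUpgrader {
    static final int TARGET_VERSION = 12; // the schema version this build expects

    // Applies upgrade scripts one version step at a time (10->11, 11->12, ...).
    public static void upgrade(Connection conn) throws Exception {
        for (int v = readVersion(conn); v < TARGET_VERSION; v++) {
            runScript(conn, "upgrade-" + v + "-to-" + (v + 1) + ".sql");
        }
    }

    static int readVersion(Connection conn) throws Exception {
        try (Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT version FROM schema_version")) {
            return rs.next() ? rs.getInt(1) : 0;
        }
    }

    static void runScript(Connection conn, String name) {
        // load the script from the classpath, execute it, and bump the
        // version row in schema_version (omitted for brevity)
    }
}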
When working with Hibernate, I use an installer class that runs from the command line and has options for creating the database schema, inserting base data, and dynamically updating the database schema using SchemaUpdate. I find it to be extremely useful. It also gives me a place to put one-off scripts that will be run when a new version is launched to, for example, populate a new field in an existing DB table.
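The dynamic update piece is very little code with the classic Hibernate 3.x/4.x API (a sketch; newer Hibernate versions moved this functionality to a different API):

import org.hibernate.cfg.Configuration;
import org.hibernate.tool.hbm2ddl.SchemaUpdate;

// Diff the mapped entities against the live schema and apply the missing DDL.
Configuration cfg = new Configuration().configure(); // reads hibernate.cfg.xml
new SchemaUpdate(cfg).execute(true, true); // print the generated script and run it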