I am testing an API that creates databases/tables in Postgres. For automated testing, I was thinking along the lines of having a setup method that creates a database with the tables set up and populated with the required data (1000 entries/rows).
I was hoping for an elegant way of doing this. Any thoughts, apart from writing code that loops 1000 times and writes data stored in a CSV into a Postgres table?
Honestly, CSV, XML, or any other structured format seems fine to me. Is there a reason you don't want to do that? Using pg_dump to export data from an existing DB and pg_restore to load it back could be a good option too.
Your other idea about writing code to generate the data isn't bad either. The benefit of writing code is that your test isn't coupled to a data file.
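If you go the code-generation route, a plain JDBC batch insert keeps the setup self-contained. Here is a minimal sketch; the table, columns, and connection details are assumptions for illustration:

```java
import java.math.BigDecimal;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class TestDataSetup {
    public static void main(String[] args) throws Exception {
        // Connection details and the accounts table are illustrative only.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/testdb", "test", "test")) {
            conn.setAutoCommit(false);
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO accounts (id, name, balance) VALUES (?, ?, ?)")) {
                for (int i = 1; i <= 1000; i++) {
                    ps.setInt(1, i);
                    ps.setString(2, "user-" + i);
                    ps.setBigDecimal(3, BigDecimal.valueOf(i * 10L));
                    ps.addBatch();
                }
                ps.executeBatch(); // one batched round trip instead of 1000 single inserts
            }
            conn.commit();
        }
    }
}
```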
Also, I would take a look at the H2 database because it has a PostgreSQL compatibility mode, and you can actually embed it in your unit/integration tests instead of relying on a PostgreSQL server being set up and configured for your tests. We've used H2 to test our PostgreSQL app, and it's worked well. The downside is that you can't be 100% sure that just because your test passes against H2, it will also pass against PostgreSQL.
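As a minimal sketch of what that looks like in a test, assuming a plain JDBC setup (the table is illustrative):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class H2PostgresModeExample {
    public static void main(String[] args) throws Exception {
        // MODE=PostgreSQL enables H2's PostgreSQL compatibility mode;
        // DB_CLOSE_DELAY=-1 keeps the in-memory DB alive for the whole JVM.
        String url = "jdbc:h2:mem:testdb;MODE=PostgreSQL;DB_CLOSE_DELAY=-1";
        try (Connection conn = DriverManager.getConnection(url, "sa", "");
             Statement stmt = conn.createStatement()) {
            stmt.execute("CREATE TABLE accounts (id INT PRIMARY KEY, name VARCHAR(100))");
            stmt.execute("INSERT INTO accounts VALUES (1, 'alice')");
            // The test then talks to this connection as if it were PostgreSQL.
        }
    }
}
```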
If you really prefer to use PostgreSQL (instead of H2) for testing, you could use Liquibase. It's a database schema management tool that supports (among other things) bulk loading data from CSV (http://www.liquibase.org/documentation/changes/load_data.html).
It also offers Spring integration if your application uses it.
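If you do use Spring, a minimal sketch of wiring Liquibase into a test context could look like this; the changelog path is an assumption, and the changelog itself would contain the loadData change that bulk-loads the CSV:

```java
import javax.sql.DataSource;

import liquibase.integration.spring.SpringLiquibase;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class LiquibaseTestConfig {
    @Bean
    public SpringLiquibase liquibase(DataSource dataSource) {
        SpringLiquibase liquibase = new SpringLiquibase();
        liquibase.setDataSource(dataSource);
        // Changelog that creates the schema and bulk-loads the CSV via <loadData>.
        liquibase.setChangeLog("classpath:db/changelog/test-changelog.xml");
        return liquibase;
    }
}
```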
Related
I would like to find a reliable way to test my MariaDB schema with jOOQ.
This is what I have now:
let jOOQ with Gradle extract an XML schema from the real DB (i.e. the MariaDB instance) via org.jooq.codegen.XMLGenerator; this step generates a maria_information_schema.xml;
use this schema to generate all the Java classes.
To test all the classes I will write, I have developed a technique:
fire up an embedded in-memory H2 database;
build a Java InitDatabase.java class which manually sets up a DB schema as similar as possible to the MariaDB one;
perform all the tests on the in-memory DB.
This procedure works perfectly until I change something on the real DB and forget to make the same change in the InitDatabase.java class.
My question is: is there a way to use the XML schema or the generated Java classes to create an H2 database with the same schema as the MariaDB one, without writing all the CREATE TABLE statements by hand?
Thanks
This is an open-ended question with no obvious, "correct" answer. jOOQ's official take here is that you may want to reconsider using an alternative RDBMS for testing when you could be testing against your target RDBMS directly, specifically using Testcontainers.
You could combine this approach with jOOQ's code generation for a more streamlined development process.
In short, jOOQ's recommendation is to use:
Flyway or Liquibase for database change management
Testcontainers for code generation
Testcontainers for integration testing
Other, equivalent products are obviously also possible.
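For illustration, a minimal sketch of that approach with a MariaDB Testcontainer and jOOQ might look like this (the image tag is an assumption, and the migration step is only indicated by a comment):

```java
import java.sql.Connection;
import java.sql.DriverManager;

import org.jooq.DSLContext;
import org.jooq.SQLDialect;
import org.jooq.impl.DSL;
import org.testcontainers.containers.MariaDBContainer;

public class MariaDbIntegrationTest {
    public static void main(String[] args) throws Exception {
        // Throwaway MariaDB instance: the tests run against the real dialect,
        // not an H2 approximation of it.
        try (MariaDBContainer<?> maria = new MariaDBContainer<>("mariadb:10.11")) {
            maria.start();
            try (Connection conn = DriverManager.getConnection(
                    maria.getJdbcUrl(), maria.getUsername(), maria.getPassword())) {
                // Run the Flyway/Liquibase migrations against conn here, then
                // exercise the generated jOOQ classes in the actual tests.
                DSLContext ctx = DSL.using(conn, SQLDialect.MARIADB);
                System.out.println(ctx.fetch("select 1"));
            }
        }
    }
}
```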
I have a Spring Boot application which uses Spring WS and has stored procedures as its backend. I am trying to write integration tests for this and am looking for an in-memory database which supports stored procedures.
I tried the H2 database, but it relies on Java functions to implement stored procedures. Is there any direct mechanism where I can plug in my stored procedures with minimal effort?
So, there is no ready-made solution. I needed this for integration testing, but the solution provided by H2 required me to rewrite the stored procedures as Java classes: http://www.h2database.com/html/features.html#user_defined_functions
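For reference, this is roughly what the H2 route would have required: each procedure rewritten as a static Java method and registered via CREATE ALIAS (the names below are purely illustrative):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class H2ProcedureExample {
    // H2 "stored procedures" have to exist as static Java methods like this one.
    public static int addTax(int amount) {
        return amount + amount / 10;
    }

    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:h2:mem:testdb", "sa", "");
             Statement stmt = conn.createStatement()) {
            stmt.execute("CREATE ALIAS ADD_TAX FOR \"H2ProcedureExample.addTax\"");
            // SELECT ADD_TAX(100) now works in H2, but every real stored
            // procedure would need the same Java rewrite.
        }
    }
}
```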
Considering the pain of writing the integration tests and the amount of effort involved, I decided against it. Hope this helps someone else.
This question is extracted from a comment I posted here:
What's the best strategy for unit-testing database-driven applications?
So I have a huge database schema for a legacy application (with quite an old code base) that has many tables, synonyms, triggers, and dblinks. We have (finally) started to test some parts of the application.
Our tests already use mocks, but in order to test the queries we are using, we have decided to use an in-memory DB with a short-lived test dataset.
However, the setup of the in-memory database requires a specific SQL script for the DB schema. The script is not the real DDL we have in production, because we cannot import that directly.
To make things harder, the database contains functions and procedures that need to be implemented in Java (we use the H2 DB, and that is the way it declares procedures).
I'm afraid that our tests won't break the day the real DB changes, and that we will spot the problem only at runtime, potentially in production.
I know that our tests sit somewhere between integration and unit tests. However, with the current architecture it is quite hard to insulate the tests from the DB, and we want proper tests for the DB queries (no ORM involved).
What would be a solution to keep the DDL as close as possible to the real one, without the need to maintain it manually?
If your environments are Dockerized, I would highly suggest checking out Testcontainers (https://www.testcontainers.org/modules/databases/). We have used it to replace in-memory databases in our tests with database instances created from the production DDL scripts.
Additionally, you can use a tmpfs mount to get performance levels similar to in-memory databases. This is nicely explained in the following post by Vlad Mihalcea: https://vladmihalcea.com/how-to-run-integration-tests-at-warp-speed-with-docker-and-tmpfs/.
This combination works great for our purposes (especially when combined with Hibernate's auto-ddl option) and I recommend that you check it out.
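As a minimal sketch of that combination (the image tag, DDL script path, and tmpfs mount point are assumptions for illustration):

```java
import java.util.Map;

import org.testcontainers.containers.PostgreSQLContainer;

public class WarpSpeedDbSetup {
    public static void main(String[] args) {
        PostgreSQLContainer<?> postgres = new PostgreSQLContainer<>("postgres:15");
        postgres.withInitScript("schema/production-ddl.sql");         // the real DDL, not a hand-maintained copy
        postgres.withTmpFs(Map.of("/var/lib/postgresql/data", "rw")); // keep the data directory in memory
        postgres.start();
        try {
            // Tests connect via postgres.getJdbcUrl(), getUsername(), getPassword().
            System.out.println(postgres.getJdbcUrl());
        } finally {
            postgres.stop();
        }
    }
}
```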
For a project I am working on (Spring/Struts 2/Hibernate), we decided to use H2 for unit testing, with MySQL as the production store, and to manage the schema in Liquibase. Pretty standard fare, but the issue we keep running into is that H2 and MySQL differ in a lot of ways, for example in how they handle timestamps and triggers. It's getting to the point that I am starting to regret using H2, as the extra headaches the mismatches are causing are starting to outweigh its benefits. My question is this: is there any other in-memory/local-file database that behaves more like MySQL? Obviously for integration testing we will still use MySQL, but it would be nice to do unit testing without either turning the Liquibase files into a giant hack or having to ensure the local MySQL DB is running.
I don't think there is another in-memory Java database that is more compatible with MySQL than H2. If you have a lot of code that only works with MySQL, then you should probably also use MySQL for testing.
Just be aware that it will be difficult to switch to another database in the future. Relying too much on the features of one product results in vendor lock-in. In the case of MySQL you at least have the option to switch to MariaDB, so it's not all that bad.
You may use a RAM drive, copy your test tables and data onto that drive, and start your MySQL configured to load from that drive, all in a script at boot time.
Then your unit tests will run insanely faster. We used this for developers' workstations and the level of frustration went down three notches.
I think that as of right now the correct approach is to use MySQL as a Docker image.
Once you create the image you can easily spin it up from your tests, and it only takes seconds. Hibernate will dynamically initialize the DB schema and there you go!
The only issue is that CI servers need to have Docker installed.
What kinds of tools are available for populating test data in MongoDB? We have used DbUnit in the past, but it doesn't seem to have an equivalent Maven plugin.
Have a look at http://eliothorowitz.com/post/459890033/streaming-twitter-into-mongodb. How does this look? Now all you need is a JSON or CSV generator, which is much easier to find.
The first question is: from what source do you want to load the data? From another MongoDB, from SQL, from XML, from a text file, etc.
As for me, I use helper classes (one per Mongo document type) that create the entities I need during testing.
I also have one test class per business object; before the tests start I run the helpers to create the test environment, and after the tests finish I delete all the created data.
This approach works even against a production database.
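A minimal sketch of such a helper with the MongoDB Java driver; the database, collection, and field names are illustrative:

```java
import java.util.List;

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import org.bson.Document;

public class UserTestDataHelper {
    private final MongoCollection<Document> users;

    public UserTestDataHelper(MongoClient client) {
        this.users = client.getDatabase("testdb").getCollection("users");
    }

    // Called before the tests to create the fixture documents.
    public void createFixtures() {
        users.insertMany(List.of(
                new Document("name", "alice").append("active", true),
                new Document("name", "bob").append("active", false)));
    }

    // Called after the tests to remove everything the helper created.
    public void cleanup() {
        users.drop();
    }

    public static void main(String[] args) {
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            UserTestDataHelper helper = new UserTestDataHelper(client);
            helper.createFixtures();
            // ... run the tests ...
            helper.cleanup();
        }
    }
}
```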
I can suggest another approach. For example, if you have a production Mongo DB with data, you can copy the Mongo data, run a new Mongo DB on that copy, run your tests against it, and delete that DB after the tests have finished.