How do we migrate/update the database schema in Grails?

We've been working with Grails for a while and my Team Lead raised some questions about the Grails ORM (GORM):
How do we maintain the database schema once we have moved to production?
Can we update the database schema with Grails?
If the schema is updated, will the changes be automatically reflected / does the framework take care of this?
Is there any plugin for Grails that will allow us to update the schema without headaches?

I recently released the official Grails plugin for database migrations - see http://grails.org/plugin/database-migration and the docs at http://grails-plugins.github.com/grails-database-migration/docs/manual/index.html
I'm working with the author of Liquibase on this, so the older liquibase plugin is now deprecated; use the new one instead, since it uses the latest version of Liquibase (2.0) and is officially supported by SpringSource. See http://blog.liquibase.org/2011/01/new-standard-liquibase-plugin-grails-database-migration.html for his announcement.
Ask usage questions on the Grails User mailing list (sign up at http://grails.org/Mailing+lists) or the new plugin forum at http://grails-plugins.847840.n3.nabble.com/, or email the author directly :)

Remove the dbCreate parameter in DataSource.groovy for your production environment - this stops GORM from auto-updating the DB schema.
Sure: use the LiquiBase plugin.
GORM can do it with dbCreate='update', but that's strongly discouraged. For instance, if you rename a field, neither GORM nor LiquiBase can determine that you need to migrate the data rather than just drop and re-create the column.
In one line: grails db-diff generates LiquiBase's changelog.xml, and grails migrate -Dgrails.env=<whatever environment> applies it to the respective DB server.
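The rename case is exactly where a hand-maintained changelog pays off: you state the rename explicitly instead of letting a diff guess drop+create. A sketch of such a changeSet (table and column names are illustrative, not from any project above):

```xml
<!-- changelog.xml excerpt; a rename preserves the existing column data,
     which no diff tool can infer from before/after schemas alone -->
<changeSet id="rename-name-to-full-name" author="dev">
    <renameColumn tableName="users"
                  oldColumnName="name"
                  newColumnName="full_name"
                  columnDataType="varchar(255)"/>
</changeSet>
```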

While the "auto create" functionality is OK to get a project up and running, I find Liquibase the best way to keep the DB up to date. There is a Grails plugin, and I believe work is under way on a DSL too.
So, create a baseline schema (you could use liquibase's generate-changelog), then make all future changes through Liquibase and it will manage the updates, rollbacks, and even some DB interop for you. You can set your DataSource.groovy config to validate, and Grails will not start up if the schema does not match the domain config:
environments {
    development {
        dataSource {
            dbCreate = "validate"
        }
    }
}
You may also be interested in the liquibase-runner plugin to run your migrations on application start.

How to use Spring boot, JOOQ and Flyway together?

So, let's consider a general Spring Boot application that uses jOOQ for database access and Flyway for database migration. The project uses Gradle for dependency management.
I want the following things:
Run my application in Docker, configured only through environment variables (https://12factor.net/config). I don't know how to configure the database login and password for both the Spring Boot application properties and the Gradle jOOQ plugin.
Automatic generation of jOOQ classes. The Flyway migration runs when the application starts, but jOOQ generates code during the Gradle build, so the tasks execute in the wrong order.
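As background for the first point: Spring Boot resolves ${...} placeholders in properties from environment variables, so the runtime side can be configured like this (the variable names are illustrative), and the Gradle jOOQ plugin can read the same variables with System.getenv:

```properties
# application.properties; DB_URL, DB_USER, DB_PASSWORD are illustrative names
# resolved from the container's environment at startup
spring.datasource.url=${DB_URL}
spring.datasource.username=${DB_USER}
spring.datasource.password=${DB_PASSWORD}
```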
I have a very similar setup, but resorted to a manual step to generate the jOOQ classes.
I need them for development, so it makes no sense for me to delay the generation until the target environment.
I decided to run a local DB for development purposes.
I run it in Docker, but that is a detail of the overall setup.
When I have a new migration, I run it with the Flyway Gradle plugin against the local DB, then I regenerate the jOOQ classes with the Gradle jOOQ plugin.
When the app is deployed in the target environment, I rely on Flyway to run migrations on startup. The matching jOOQ classes are packaged, so everything works smoothly.
The jOOQ GitHub project has an example project that uses jOOQ with Spring Boot and the sql-maven-plugin.
You can easily replace the sql-maven-plugin by the Flyway plugin as demonstrated in the jOOQ/Flyway example project or this blog post.
On a related note, in case you're using one of the commercial distributions of jOOQ with Spring Boot, that setup is documented in this blog post.
The following Gradle task requires flyway, otj-pg-embedded, jooq, and the postgresql driver:
import com.opentable.db.postgres.embedded.*
import org.flywaydb.core.*
import org.jooq.codegen.*

tasks.named("compileKotlin") {
    doFirst {
        // create an embedded PostgreSQL instance
        EmbeddedPostgres.builder().setPort(5400).start().use {
            // migrate the embedded PostgreSQL database
            Flyway.configure()
                .locations("filesystem:$projectDir/migrations/")
                .schemas("public")
                .dataSource(it.postgresDatabase)
                .load()
                .migrate()
            // generate jOOQ classes
            GenerationTool.generate("some xml for jooq")
        }
    }
}
The source is https://gist.github.com/whyoleg/63195b60eb85e8fe2114b30f28b892ef
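For reference, the XML handed to GenerationTool.generate generally has this shape (the package and directory names here are illustrative, not taken from the gist):

```xml
<configuration>
  <jdbc>
    <driver>org.postgresql.Driver</driver>
    <url>jdbc:postgresql://localhost:5400/postgres</url>
  </jdbc>
  <generator>
    <database>
      <inputSchema>public</inputSchema>
    </database>
    <target>
      <packageName>com.example.generated</packageName>
      <directory>build/generated-jooq</directory>
    </target>
  </generator>
</configuration>
```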

How to handle DCL in Flyway migration scripts?

I have a postgres database that has users and roles already defined. There are multiple schemas in this database that are all controlled via different projects/flyway scripts. I am working on adding Flyway integration into a new project where we will use an embedded Postgres instance for testing.
Since none of these users/roles will exist on this instance, they need to be created in a migration script. However, since these users/roles will already exist in my operational databases, the migrations will fail when they attempt to create the roles.
I already considered writing a function for this, but then the function would have to be included in any project that uses the embedded Postgres and would have to be maintained across multiple code bases. This seems very sloppy. Can anyone recommend a way for me to handle these DCL operations using Flyway that will work with the embedded approach as well as my operational databases?
In a previous project we used a set of additional Flyway migration scripts for this, added to the test environment classpath.
We used this with a Flyway version that predates callbacks and repeatable migrations.
A second option: add a callback configuration for your test environment and create your users and roles in the before- or after-migration phase.
A third option: use repeatable migration scripts for your user and role setup (see https://flywaydb.org/documentation/migration/repeatable) in both production and test. In this case your SQL must be written to be correctly repeatable, otherwise you will break your production environment.
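One way to make the role creation itself repeatable on PostgreSQL is a DO block that swallows duplicate_object, so the same script runs cleanly against a fresh embedded database and an operational one. A minimal sketch; the role name and the helper method are illustrative:

```java
public class IdempotentDcl {
    // Builds a PostgreSQL DO block that creates a role only if it does
    // not already exist: CREATE ROLE raises duplicate_object when the
    // role is present, and the exception handler turns that into a no-op.
    static String createRoleIfAbsent(String role) {
        return "DO $$ BEGIN\n"
             + "  CREATE ROLE " + role + ";\n"
             + "EXCEPTION WHEN duplicate_object THEN\n"
             + "  RAISE NOTICE 'role " + role + " already exists';\n"
             + "END $$;";
    }

    public static void main(String[] args) {
        System.out.println(createRoleIfAbsent("app_reader"));
    }
}
```

The same statement can live in a repeatable migration or a test-only callback script, since running it twice changes nothing.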

IntelliJ IDEA Hibernate

I'm learning Hibernate and I am running into some issues. I'm reading "Harnessing Hibernate" by O'Reilly. They explain everything using Ant, but since I want to avoid writing a huge build.xml file, I'm trying to get it to work with IntelliJ.
I managed to create a mapping for a table in a MySQL database and wrote the bean for it. It worked, but I can't find any information on how to generate beans and SQL code, or how to reverse engineer with IntelliJ. I found loads of tutorials for Eclipse using the JBoss Hibernate Tools plugin, and the IntelliJ site claims this code-generation support is already in the standard installation.
Am I forgetting some configuration, such as adding libraries? I'm trying to find this out but I'm desperate now. Please don't suggest Eclipse; I need IntelliJ for my current role.
AFAIK, IntelliJ IDEA includes complete JPA/Hibernate support in its Ultimate Edition:
Generating Persistence Mappings from Database Schema: IntelliJ IDEA allows you to quickly generate persistence mappings from any database schema (source: jetbrains.com).
Now, the question is, which edition of IntelliJ IDEA are you using?
If you add the hbm2ddl.auto property to your Hibernate config and ask it to create the database schema, you'll get the schema by running a single test or any other code that exercises Hibernate. Once you have it, turn off create.
Let Hibernate do the work.
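That switch is a one-line setting; a typical hibernate.properties fragment (the standard values are create, create-drop, update, validate, and none):

```properties
# generate the schema on startup while bootstrapping the project;
# switch to validate (or remove the line) once the schema exists
hibernate.hbm2ddl.auto=create
```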

how to close hibernate tools db connections

When developing a new project I often want to re-create the schema to apply any new entities or relationships. I like using Hibernate Tools in Eclipse, but it's a pain when wanting to drop and re-create the schema, since it seems to maintain open connections to the DB (Postgres in this case).
Does anybody know if there is an easy way of getting the eclipse hibernate tools plugin to close off all connections?
It might help to know that I was using Hibernate Tools for creating the schema too, and I never had the problem you are describing. Nowadays I use Liquibase for schema migration. It is a different approach, but I like it.
I've never hit this problem before, but we use Maven run from the command line to create/update the schema and DBUnit to upload data. Then back to Eclipse...

How to upgrade database schema built with an ORM tool?

I'm looking for a general solution for upgrading database schema with ORM tools, like JPOX or Hibernate. How do you do it in your projects?
The first solution that comes to my mind is to create my own mechanism for upgrading databases, with SQL scripts doing all the work. But in this case I'll have to remember about creating new scripts every time the object mappings are updated. And I'll still have to deal with low-level SQL queries, instead of just defining mappings and allowing the ORM tools to do all the job...
So the question is how to do it properly. Maybe some tools allow for simplifying this task (for example, I heard that Rails have such mechanism built-in), if so please help me decide which ORM tool to choose for my next Java project.
LiquiBase is an interesting open source library for handling database refactorings (upgrades). I have not used it, but will definitely give it a try on my next project where I need to upgrade a db schema.
I don't see why ORM-generated schemas are any different from other DB schemas: the problem is the same. Assuming your ORM will spit out a generation script, you can use an external tool to do the diff.
I've not tried it, but Google came back with SQLCompare as one option; I'm sure there are others.
We hand code SQL update scripts and we tear down the schema and rebuild it applying the update scripts as part of our continuous build process. If any hibernate mappings do not match the schema, the build will fail.
You can check this feature comparison of some database schema upgrade tools.
A comparison of the number of questions tagged on Stack Overflow for some of those tools:
mybatis (1049 questions tagged)
Liquibase (663 questions tagged)
Flyway (400 questions tagged)
DBDeploy (24 questions tagged).
DbMaintain can also help here.
I think your best bet is to use an ORM tool that includes database migration, like SubSonic:
http://subsonicproject.com/2-1-pakala/subsonic-using-migrations/
We ended up writing update scripts each time we changed the database. So there's a script from version 10 to 11, from 11 to 12, etc. Then we can run any consecutive set of scripts to move from some existing version to the new version. We stored the current version in the database so we could detect it on startup.
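The scheme boils down to selecting a consecutive chain of scripts from the stored version up to the target version. A minimal sketch of that selection; the file-naming convention is hypothetical:

```java
import java.util.ArrayList;
import java.util.List;

public class UpgradePlanner {
    // Returns the upgrade scripts to run, in order, to move the schema
    // from currentVersion to targetVersion. Scripts are assumed to be
    // named upgrade_<from>_to_<from+1>.sql, one per version step.
    static List<String> plan(int currentVersion, int targetVersion) {
        if (targetVersion < currentVersion) {
            throw new IllegalArgumentException("downgrades are not supported");
        }
        List<String> scripts = new ArrayList<>();
        for (int v = currentVersion; v < targetVersion; v++) {
            scripts.add("upgrade_" + v + "_to_" + (v + 1) + ".sql");
        }
        return scripts;
    }

    public static void main(String[] args) {
        // database reports version 10, application needs version 12
        System.out.println(plan(10, 12));
    }
}
```

This is essentially what Flyway and Liquibase automate, including the version-tracking table.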
Yes this involved database-specific code! One of the main problems with Hibernate!
When working with Hibernate, I use an installer class that runs from the command-line and has options for creating database schema, inserting base data, and dynamically updating the database schema using SchemaUpdate. I find it to be extremely useful. It also gives me a place to put one-off scripts that will be run when a new version is launched to, for example, populate a new field in an existing DB table.
