In my database every table has 4 common columns: DATE_CREATED, USER_CREATED, DATE_MODIFIED, USER_MODIFIED, and I want this rule to apply to all new tables implicitly.
Is it possible to do this without having to write the Liquibase script manually?
This is not possible using liquibase (as far as I know).
The reason for this is simple:
What if you change your mind and add or remove one of the default columns later? Changing all tables is then not possible with Liquibase, as it would mean changing all changesets, which is not allowed.
You could use a DSL to generate your Liquibase scripts; that way you can add a common set of columns to every entity. A fully automatic approach would be difficult with the way Liquibase works.
There is nothing built into Liquibase to support this.
Your easiest option would be to use XML document entities. They are a purely XML-level mechanism and therefore transparent to Liquibase, and they allow you to include common XML fragments in your changelog files.
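As a sketch (file and column names here are illustrative, and it assumes your XML parser resolves external entities), you can declare an entity in the changelog's DOCTYPE and reference it inside any createTable change:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE databaseChangeLog [
    <!-- pulls in the shared column definitions from a separate file -->
    <!ENTITY commonColumns SYSTEM "common-columns.xml">
]>
<databaseChangeLog xmlns="http://www.liquibase.org/xml/ns/dbchangelog">
    <changeSet id="1" author="me">
        <createTable tableName="customer">
            <column name="id" type="bigint"/>
            &commonColumns;
        </createTable>
    </changeSet>
</databaseChangeLog>
```

Here common-columns.xml would contain the four `<column .../>` elements for DATE_CREATED, USER_CREATED, DATE_MODIFIED and USER_MODIFIED. The entity is expanded by the XML parser before Liquibase ever sees the document.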
A more complex approach would be to use the Liquibase extension system (http://liquibase.org/extensions) which allows you to redefine the logic to convert changeSets into SQL. That would allow you to inject any logic you want, including common data types, standard columns, or anything else.
I do not think so.
My suggestion: don't add the above-mentioned 4 columns to every table, because for existing entries they would all be left holding null values.
Instead, create a separate audit table containing a primary key id, the table (or entity) name, and your four columns.
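Such an audit table might look like this (a sketch; all names and types are illustrative):

```sql
CREATE TABLE audit_log (
    id            BIGINT PRIMARY KEY,
    entity_name   VARCHAR(100) NOT NULL,  -- table the audited row lives in
    record_id     BIGINT NOT NULL,        -- primary key of the audited row
    date_created  TIMESTAMP,
    user_created  VARCHAR(100),
    date_modified TIMESTAMP,
    user_modified VARCHAR(100)
);
```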
Related
Currently I am working on a jooq project where I need to perform schema validation of the columns.
What's the best way to get a table's schema using jOOQ, given the table name?
DSLContext.meta() takes a lot of time to fetch the schema.
Thanks in advance.
By default, DSLContext.meta() queries your entire database with all the schemas and all the tables, even if you only consume parts of it.
You can use Meta.filterSchemas() (and possibly even Meta.filterTables()) to filter out content prior to querying it.
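A sketch of that filtering (schema and table names are illustrative; filterSchemas/filterTables are available in newer jOOQ versions):

```java
import org.jooq.DSLContext;
import org.jooq.Meta;
import org.jooq.Table;

public class SchemaLookup {

    // Restrict meta() to a single schema (and table) so jOOQ does not
    // introspect the entire database before you query its structure.
    static Table<?> lookup(DSLContext ctx, String schema, String tableName) {
        Meta meta = ctx.meta()
                       .filterSchemas(s -> s.getName().equals(schema))
                       .filterTables(t -> t.getName().equals(tableName));

        // The filtered Meta behaves like the full one, just smaller.
        return meta.getTables(tableName).get(0);
    }
}
```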
I am working on a Java EE application, and there are 1000+ tables in the database; now I have to query records by parameters coming from the client.
Generally I would create one Entity for each table, and create the Dao and Service to do the query.
However, I face two problems:
1 Number of the tables
As I said, there are 1000+ tables with roughly 40+ columns each; it would be a nightmare to create the entities one by one.
2 Schema updates
Even if I can generate the entities programmatically, the schema of the data may change at times, which is out of my control.
And in my application, only read operations touch this kind of data; no update, delete, or create is required.
So I wonder if the following solution is possible:
1 Use Map instead of POJOs
Do not create POJOs at all; wrap each row's columns and values in a plain Map.
2 Row mapping
When querying with Hibernate, Spring's JdbcTemplate, or something else, use a mapper to map each row to an entry in the map.
If yes, I would use ResultSetMetaData to detect each column's name, type, and value:

ResultSetMetaData rmd = rs.getMetaData();
// JDBC column indexes are 1-based
for (int i = 1; i <= rmd.getColumnCount(); i++) {
    int type = rmd.getColumnType(i); // a java.sql.Types constant
    if (type == Types.VARCHAR) {
        ...
    } else if (type == Types.INTEGER) {
        ...
    }
}
This looks like part of JPA's job; is there any library that can be used here?
If not, are there any other alternatives?
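For what it's worth, the Map-based approach above is essentially what Spring's JdbcTemplate already gives you: queryForList returns one Map per row, keyed by column name. A sketch (the table, column, and dataSource names are illustrative):

```java
import java.util.List;
import java.util.Map;
import org.springframework.jdbc.core.JdbcTemplate;

JdbcTemplate jdbc = new JdbcTemplate(dataSource);

// Each row becomes a Map<columnName, value>; no per-table POJO is needed.
List<Map<String, Object>> rows =
        jdbc.queryForList("SELECT * FROM some_table WHERE status = ?", "ACTIVE");

for (Map<String, Object> row : rows) {
    System.out.println(row.get("ID"));
}
```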
Each table in my database has these fixed columns:
record_version
record_timestamp
create_time
Thanks to jOOQ's optimistic locking feature, record_version and record_timestamp are handled automatically.
Is it possible to centralize and automate the maintenance of create_time ?
Internally, jOOQ already manages the first two special fields.
Can I put a handler somewhere to fill the create_time value?
Doing so, I could remove some boilerplate code to initialize this field.
I have another field on each table: update_by_account. Anyway, if I'm able to manage the previously mentioned field (create_time), I think I'll be able to handle this one too.
Thanks in advance.
Future jOOQ versions:
What you're looking for is the Listener API feature for Record and UpdatableRecord. It has not yet been implemented as of jOOQ 3.0.
In the future, this feature will allow you to inject behaviour into jOOQ's records when they are stored, inserted, updated, deleted, etc. This would include what you are requesting, as well as what jOOQ currently does with its
record_version
record_timestamp
columns used for optimistic locking.
A solution for jOOQ 2.x or 3.0:
What you can do right now is implement an ExecuteListener and let it "detect" whether it should become active and override a bind value for create_time. This will probably require parsing, or at least regex-matching, your SQL statement.
A SQL-only solution:
However, the best option for create_time, in my opinion, is to write a trigger for every one of your tables (Oracle syntax):
CREATE OR REPLACE TRIGGER my_trigger
BEFORE INSERT
ON my_table
REFERENCING NEW AS new
FOR EACH ROW
BEGIN
  :new.create_time := SYSDATE;
END my_trigger;
This will guarantee that the value is available no matter how you access your database...
I have a use case where I need to read rows from a file, transform them using an engine, and then write the output to a database (which can be configured).
While I could write a query builder of my own, I was interested in knowing if there's already an available solution (library).
I searched online and found the jOOQ library, but it is type-safe and has a code-gen tool, so it seems suited to static database schemas. In my use case, databases can be configured dynamically and their metadata is read programmatically and made available for write purposes (a list of tables is presented, the user selects the columns to write, and the insert script for those columns needs to be created dynamically).
Is there any library that could help me with the use case?
If I understand correctly, you need to query the database structure, display the result via a GUI, and have the user map data from a file to that structure?
Assuming this is the case, you're not looking for a 'library', you're looking for an ETL tool.
Alternatively, if you're set on writing something yourself, the (very) basic way to do this is:
Read the structure of the database using Connection.getMetaData(). The exact usage can vary between drivers, so you'll need to create an abstraction layer that meets your needs; I'd assume you're only interested in the table structure here.
Map the format of the file to a structure similar to the tables.
Provide a GUI that allows the user to connect elements from the file to columns in the table, including any type mapping that is needed.
Create a parameterized insert statement based on the file-element-to-column mapping; this is just a simple bit of string concatenation.
Loop through the rows in the file, performing a batch insert for each.
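The last two steps can be sketched with plain JDBC (the table and column names come from the user's mapping; everything here is illustrative):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.Collections;
import java.util.List;

public class InsertBuilder {

    // Build "INSERT INTO t (a, b) VALUES (?, ?)" from the user's column mapping.
    public static String buildInsert(String table, List<String> columns) {
        String cols = String.join(", ", columns);
        String params = String.join(", ", Collections.nCopies(columns.size(), "?"));
        return "INSERT INTO " + table + " (" + cols + ") VALUES (" + params + ")";
    }

    // Insert all rows in one batch; each row's values must match the column order.
    public static void insertAll(Connection conn, String table,
                                 List<String> columns, List<List<Object>> rows) throws Exception {
        try (PreparedStatement ps = conn.prepareStatement(buildInsert(table, columns))) {
            for (List<Object> row : rows) {
                for (int i = 0; i < row.size(); i++) {
                    ps.setObject(i + 1, row.get(i)); // JDBC parameters are 1-based
                }
                ps.addBatch();
            }
            ps.executeBatch();
        }
    }
}
```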
My advice: get an ETL tool. This sounds like a simple problem, but it's full of idiosyncrasies; getting even an 80% solution will be tough and time-consuming.
jOOQ (the library you referenced in your question) can be used without code generation as indicated in the jOOQ manual:
http://www.jooq.org/doc/latest/manual/getting-started/use-cases/jooq-as-a-standalone-sql-builder
http://www.jooq.org/doc/latest/manual/sql-building/plain-sql
When searching through the user group, you'll find other users leveraging jOOQ in the way you intend.
The steps you need to perform are:
read the rows
build each row into an object
transform the above object to target object
insert the target object into the db
Among the above 4 steps, the only thing you need to do is step 3.
And for the above purpose, you can use Transmorph, EZMorph, Commons-BeanUtils, Dozer, etc.
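If you'd rather not pull in one of those libraries, step 3 can also be a minimal hand-rolled transformer (a sketch; all names are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

public class RowTransformer {

    // Rename source keys to target keys according to fieldMapping,
    // dropping any source fields that are not mapped.
    public static Map<String, Object> transform(Map<String, Object> source,
                                                Map<String, String> fieldMapping) {
        Map<String, Object> target = new HashMap<>();
        for (Map.Entry<String, String> e : fieldMapping.entrySet()) {
            if (source.containsKey(e.getKey())) {
                target.put(e.getValue(), source.get(e.getKey()));
            }
        }
        return target;
    }
}
```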
Is there a way to tell Hibernate's hbm2ddl not to create a specific table, but still have the model recognized by Hibernate?
The thing is that the model maps to a view, and I want to have an in-memory database (empty on startup and deleted on termination) for testing; hence, maintaining two sets of mappings is out of the question.
Okay, this doesn't exactly answer the question (there's probably no way to do it with the current version), but it does solve the issue at hand.
So, in the end I let Hibernate create the table, but later on forcefully drop it and put in my own create view statement. There appear to be two ways to do this.
The first way is by using the <database-object> element, specifically the child element called <create>, like so:
<class table="MY_VIEW"></class>

<database-object>
    <create>
        drop table MY_VIEW;
        create view MY_VIEW etc etc;
    </create>
</database-object>
The other way is by entering the same statements in import.sql. This feature is barely documented; it is not deprecated, but I find the previous method less painful (the create view statement is several lines long).
Is there a way to tell hibernate's hbm2ddl to not create specific table
AFAIK, hbm2ddl is "all or nothing"; you can't exclude specific tables. But you could have it output the generated DDL to a file, instead of automatically exporting it to the database, if you want to alter the DDL. Would that help?
but still have the model be recognized by Hibernate.
I didn't get that part. Do you mean having Hibernate validate the database against the mapping?
I had a similar problem. I'm trying to extend an existing schema, so I only want my "new" tables to be created (dropped/altered/etc). I couldn't find any way to tell the hbm2ddl tool to use these entities in its model for validation, but not to generate SQL for them.
So I wrote a simple Perl script to remove those statements from the generated SQL. It's designed to work in a shell script pipeline, like so:
cat your-sql-file.sql | scrub-schema.pl table1 table2 table3 ... > scrubbed.sql
The code is available here (uses the Apache v2 license):
https://github.com/cobbzilla/sql-tools/blob/master/scrub-schema.pl
I hope this is helpful.