Database independence using Hibernate - Java

I am using Hibernate for ORM in my Java application. I want to write custom queries combining multiple tables and using DB functions like sum(salary).
I also want to support multiple databases without writing the SQL again and again for each database. The approach currently followed
is to have Stored Procedures specific to each DB (Oracle, MySQL, etc.); whichever one we want to support, we change the configuration file in the application.
What I am looking for is a very generic solution, so that I need not write Stored Procedures or SQL for every new piece of functionality.

If you really want to keep it portable, you'll need to do it all with HQL.
There's no reason that you couldn't do multi-table joins and aggregate functions in HQL; you just need to limit yourself to the ones it supports.
Once you start doing database-vendor specific things, you are no longer database independent, by definition.
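For example, a multi-table join with an aggregate can be expressed directly in HQL. A minimal sketch, assuming an open Session named session and illustrative entity/property names (Employee, Department, salary):

    // HQL: join two mapped entities and aggregate with sum() in a portable way.
    // Employee, Department and salary are illustrative names, not from the question.
    List<Object[]> totals = session.createQuery(
            "select d.name, sum(e.salary) "
            + "from Employee e join e.department d "
            + "group by d.name", Object[].class)
        .list();

Hibernate renders this through the configured dialect, so the same HQL runs on Oracle, MySQL, etc.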

A perfect fit here is Hibernate Criteria.
Hibernate provides alternate ways of manipulating objects and in turn data available in RDBMS tables. One of the methods is Criteria API which allows you to build up a criteria query object programmatically where you can apply filtration rules and logical conditions.
http://www.tutorialspoint.com/hibernate/hibernate_criteria_queries.htm
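As a rough illustration of the classic Criteria API (deprecated since Hibernate 5.2 in favour of the JPA Criteria API), again assuming an open Session and illustrative names:

    // Classic Hibernate Criteria API: filter and aggregate without writing SQL.
    // Employee and salary are illustrative names.
    Criteria criteria = session.createCriteria(Employee.class)
        .add(Restrictions.gt("salary", 2000))          // filtration rule
        .setProjection(Projections.sum("salary"));     // aggregate function
    Number totalSalary = (Number) criteria.uniqueResult();

Like HQL, the Criteria query is rendered through the dialect, so it stays database independent as long as you stick to the supported restrictions and projections.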

Related

Create schema and tables on demand at runtime

So as the title suggests - I need to create an application (preferably Spring Boot) which will create schemas and tables based on user input. Basically, a REST endpoint will be offered to the clients where they would upload their data model in JSON format. I'll be parsing the JSON and constructing the DB artifacts (schema and tables) at runtime. And once all the tables are created, provide a REST endpoint (with a unique identifier) to the client to perform CRUD operations on their schema.
The approach I am considering currently is -
Create a super user in the DB, before deploying the app, which will have privileges to create new schemas and databases.
Create prepared statements to invoke schema/table creation on demand. The prepared statements will have placeholders to take the schema name and table definition (a rough sketch of this step is shown after this list).
After proper authentication, allow users to upload their data model definition in JSON.
Clean the JSON and invoke the schema/table creation prepared statements.
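For the second step, note that most JDBC drivers do not allow bind parameters for identifiers (schema or table names) in DDL, so in practice the statement text has to be assembled from a validated name rather than a placeholder. A minimal sketch, where dataSource is your configured DataSource and validateIdentifier is a hypothetical helper that rejects anything but safe characters:

    // Hypothetical sketch: DDL built from a validated schema name, since JDBC
    // bind parameters cannot stand in for identifiers in CREATE statements.
    String schema = validateIdentifier(requestedSchemaName); // e.g. allow only [A-Za-z0-9_]
    try (Connection con = dataSource.getConnection();
         Statement stmt = con.createStatement()) {
        stmt.executeUpdate("CREATE SCHEMA " + schema);
        stmt.executeUpdate("CREATE TABLE " + schema + ".customer ("
                + "id BIGINT PRIMARY KEY, name VARCHAR(255))");
    }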
A few questions that I had in mind -
Since all these DB operations will be invoked from a single super user's account, is it safe?
The schemas and tables will be realized using native SQL queries instead of Hibernate's ORM capabilities. Is it safe/efficient?
For the CRUD operations, is it possible to switch the DB connection from the super user to the client-specific schema created in the earlier steps? Or should I continue using the same super user for the CRUD operations?
It would be nice if it were possible to switch schemas at runtime using Hibernate/Spring Boot.
What I would like is a general approach to this problem. I do not need any code.
A typical web application already has permissions to DELETE all the data for all the users.
JPA makes your queries slower, not faster. JPA can help with caching, but it doesn’t seem you need this.
Yes, you can have multiple datasources in Spring Boot. Look at this for example: https://www.baeldung.com/spring-abstract-routing-data-source (a minimal sketch follows below).
Be aware that your database might not like having millions of tables. Query planning, maintenance jobs, backups, etc. all get performance penalties. Basically, databases are not designed for your use case.
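To illustrate the routing idea from that link: a minimal sketch of per-request schema switching with Spring's AbstractRoutingDataSource, where TenantContext is a hypothetical ThreadLocal holder (not a Spring class):

    // Hypothetical ThreadLocal holder for the current tenant/schema key.
    public class TenantContext {
        private static final ThreadLocal<String> CURRENT = new ThreadLocal<>();
        public static void setCurrentTenant(String tenant) { CURRENT.set(tenant); }
        public static String getCurrentTenant() { return CURRENT.get(); }
    }

    // Routes each call to one of the DataSources registered via setTargetDataSources().
    public class TenantRoutingDataSource
            extends org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource {
        @Override
        protected Object determineCurrentLookupKey() {
            return TenantContext.getCurrentTenant();
        }
    }

A web filter or interceptor would set the tenant key at the start of each request, so the CRUD operations transparently hit the right schema.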

connect to different databases with single query using hibernate

I am working on a legacy application. It uses Java and Hibernate. The problem is that there are SQL joins which are being executed through Hibernate.
These SQLs involve two tables, TableA and TableB. The problem is that TableA has now moved to a database in the US and TableB to a database in the UK, meaning both are at different locations and in different schemas. Now I have to migrate the application so that these joins can still be executed.
How can I use this join to fetch the data from these two tables, or how do I configure Hibernate to connect to different databases so that the SQL join can be executed?
According to this Q&A:
Doing a join over 2 tables in different databases using Hibernate
... it cannot be done by Hibernate itself.
The other approach to consider would be to use XA to integrate the databases. But that is heavy-weight and not likely to be performant. See this Q&A
What is the 'best' way to do distributed transactions across multiple databases using Spring and Hibernate
... which sums it up like this:
The best way to distribute transactions over more than one database is: Don't.
In your case, this means that you should pull the data from the two tables separately and then "merge" them programmatically. Clunky.
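A rough sketch of that "fetch separately and merge" approach, assuming two separately configured SessionFactories (one per database) and illustrative entity and accessor names:

    // Two SessionFactories, one bound to the US database and one to the UK database.
    Session usSession = usSessionFactory.openSession();
    Session ukSession = ukSessionFactory.openSession();
    try {
        List<TableA> aRows = usSession.createQuery("from TableA", TableA.class).list();
        List<TableB> bRows = ukSession.createQuery("from TableB", TableB.class).list();

        // "Join" in memory: index one side by its key, then look up the other.
        Map<Long, TableB> bById = bRows.stream()
                .collect(Collectors.toMap(TableB::getId, b -> b));
        for (TableA a : aRows) {
            TableB match = bById.get(a.getTableBId());  // hypothetical foreign-key accessor
            // ... combine a and match as required
        }
    } finally {
        usSession.close();
        ukSession.close();
    }

This works for modest data volumes, but any filtering and paging that the SQL join used to do now has to happen in application code.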
Alternatively, have a long hard discussion with management about doing something about your organization's split-brain database problem. (For example, could the UK and US databases each hold read-only snapshots of the other sites business-critical tables?)
Please note that the above is substantially "opinion", but I don't think we can do much better than that. My understanding is that there are no "silver bullet" solutions to this difficult problem.

Java-MyBatis with HQL/other generic SQL engine or an API to convert SQL

I am working on a project which uses the Activiti library and, for that reason, I am using the MyBatis API to execute some native queries against the Activiti database.
The problem is: for some builds I am using Oracle and, for others, MySQL. As the dialects of these two databases are different, it would be necessary to use the MyBatis multi-vendor support; however, I didn't like it, because I would have to deal with statements specific to each type of database, which makes maintenance a hard task, especially if another database needs to be supported in the future.
So I would like to know if it is somehow possible to use HQL together with MyBatis, or if there is another generic SQL engine that can be used in this case.
Or if someone knows of a free Java API that converts MySQL queries to Oracle queries. I tried to use Hibernate's translation, however it didn't work, because the Activiti classes are not using JPA, so they are not mapped :(.
Thanks in advance!
Sandro
In Activiti, we do use the multi-vendor support of MyBatis. However, there are only a few cases where we really needed it. Do you have that many 'special' queries?

Distributed query with Hibernate multi-tenancy

I am using Hibernate's multi-tenancy feature via JPA, with a database-per-tenant strategy. One of my requirements is to be able to run a query against a table that exists in each database, but obviously with different data. Is this possible?
Thanks in advance for your time.
Nope, this is not possible, because when Hibernate runs queries it is already initialized with a connection. Multi-tenancy support in Hibernate is basically done a little "outside of Hibernate" itself. It's kind of feeding Hibernate with the proper connection, and once it's fed :) it's bound to that connection.
If you need cross-tenant queries you might want to reconsider multi-tenancy, or change JPA provider to one that supports the "shared schema" approach, e.g. EclipseLink. With the shared schema approach you have two choices:
run a native query against the table containing the mt-aware entities
create an additional entity - don't mark it as multitenant - map it to the table containing the mt-aware entities and run a JPQL query in the standard manner (a rough sketch follows below)
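A minimal sketch of that second option, assuming standard JPA annotations, EclipseLink's shared-schema multi-tenancy, an open EntityManager em, and illustrative table/column names; the cross-tenant entity simply omits the multitenant mapping and reads the tenant discriminator like any other column:

    // Regular (non-multitenant) entity mapped onto the same table as the
    // tenant-aware entity; ACCOUNT and TENANT_ID are illustrative names.
    @Entity
    @Table(name = "ACCOUNT")
    public class CrossTenantAccount {
        @Id
        private Long id;

        @Column(name = "TENANT_ID")   // the tenant discriminator, exposed as a plain column
        private String tenantId;

        private BigDecimal balance;

        // getters/setters omitted
    }

    // Query across all tenants in the standard JPQL manner:
    List<CrossTenantAccount> all = em.createQuery(
            "select a from CrossTenantAccount a", CrossTenantAccount.class)
        .getResultList();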

unifying query language for accessing data from heterogeneous database

In my current project, I am trying to unify the query language for accessing heterogeneous databases. Heterogeneous databases means their query languages for accessing data are different. For instance, SQL is the query language for accessing data in Apache Derby, while MongoDB uses its own non-SQL query language.
My question is "Is there any domain specific language, which have been proposed to unify heterogeneous databases ? "
Please feel free to direct me other efforts as well.
That's quite an interesting question. There is at least one proposed solution called UnQL (Unstructured Data Query Language) - http://www.couchbase.com/press-releases/unql-query-language.
I suppose out of the box UnQL will work at least for CouchDB and SQLite. This just seems to be a great step ahead.
Personally I would say such a task seems to be a tricky one because of the conceptual differences between structured and unstructured data approaches. Anyway, it should be relatively easy to develop such a DSL for the well-defined SQL and NoSQL data models used by a particular application.
There is a project called Hibernate OGM, which aims to generalize JPQL to NoSQL databases.
From their web page:
Hibernate Object/Grid Mapper (OGM) aims at providing Java Persistence (JPA) support for NoSQL solutions. It reuses Hibernate Core's engine but persists entities into a NoSQL data store instead of a relational database. It reuses the Java Persistence Query Language (JP-QL) to search their data.
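On paper, that JP-QL reuse would look like ordinary JPA code. A minimal sketch, assuming a persistence unit named "ogm-pu" configured with Hibernate OGM as the provider, a NoSQL datastore behind it, and an illustrative Invoice entity:

    // Standard JPA bootstrap; "ogm-pu" is assumed to be configured for Hibernate OGM.
    EntityManagerFactory emf = Persistence.createEntityManagerFactory("ogm-pu");
    EntityManager em = emf.createEntityManager();

    // The same JP-QL you would write against a relational database:
    List<Invoice> invoices = em.createQuery(
            "select i from Invoice i where i.total > :min", Invoice.class)
        .setParameter("min", new BigDecimal("100"))
        .getResultList();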
I haven't tried it out myself, so I cannot say how useful it is.
JSONiq can process data from different SQL and NoSQL products.
The open source implementation of JSONiq has connectors for Couchbase, Oracle NoSQL, SQLite, and JDBC.
For instance, the following slide deck showcases the same query being executed on both Couchbase and MongoDB: https://speakerdeck.com/wcandillon/jsoniq-the-sql-of-nosql
SPARQL is a W3C-standardized query language that works on top of an abstract data model (RDF), rather than a specific type of database, which makes it very suitable as an enabler for heterogeneous database querying.
Implementations of SPARQL exist on top of various NoSQL databases, including native RDF databases (often referred to as triplestores), as well as on top of relational databases.
