Recently I have been working on a bilingual project, and for some reference tables I need to sort the data. Because the project is bilingual, the data comes from two languages (in my case English and French) and I would like to sort it all together so that, for example, Île comes before Inlet.
An ordinary ORDER BY puts Île at the end of the list. I finally resorted to a nativeQuery and sorted the data using the database engine's own function (in Oracle that means using NLS_SORT).
But that ties me to a specific database engine and version, so if I switch my database to PostgreSQL, for example, the application will break. I am looking for a native JPA solution (if one exists) or any other approach.
To achieve this without a native JPA query, I can see only two ways:
Create a DB view that includes the escaped/translated columns based on DB functions. That way the database-specific differences live only in the CREATE VIEW statement. You can define a OneToOne relation from the original entity to the view.
Create an extra column that stores the escaped values and sort by it. The application can perform the escaping/translation before storing the data, using JPA entity listeners or in the persist/merge methods (see the sketch below).
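As a minimal sketch of the second approach (all names here are made up for illustration), a hypothetical Reference entity could keep an accent-free copy of the name in a separate column, filled in by JPA lifecycle callbacks:

```java
import java.text.Normalizer;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.PrePersist;
import javax.persistence.PreUpdate;

@Entity
public class Reference {

    @Id
    @GeneratedValue
    private Long id;

    private String name;

    // Extra column holding the accent-free value, used only for sorting
    @Column(name = "NAME_SORTABLE")
    private String nameSortable;

    @PrePersist
    @PreUpdate
    private void updateSortKey() {
        // Decompose characters and strip the combining marks,
        // so that "Île" becomes "ile" and sorts next to "Inlet"
        String decomposed = Normalizer.normalize(name, Normalizer.Form.NFD);
        this.nameSortable = decomposed.replaceAll("\\p{M}", "").toLowerCase();
    }

    // getters/setters omitted
}
```

A JPQL query can then simply ORDER BY r.nameSortable, which behaves the same on Oracle and PostgreSQL. It is only an approximation of full locale-aware collation, but it covers the Île/Inlet case.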
Good luck!
I have a MySQL relational database with football statistics that contains a matches table. I created a method in my Spring project to build a standings table. This method uses a projection because I need each match object to include the two team objects. This request (get all matches + get the two teams in each match) takes around 7 seconds.
The same information retrieved through a view in my database takes 0.231 seconds.
I'm very new to Spring Data, so my question is: should I use database views when I need to join tables? Is there any advice against doing so?
I don't see any problem with using database views. You can map to them with the JPA @Table annotation.
The only potential problem is when migrating databases (you will have to make sure the views are migrated correctly).
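For example (entity and view names are made up here; @Immutable is Hibernate-specific and optional), a read-only entity mapped onto a standings view might look like this:

```java
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;
import org.hibernate.annotations.Immutable;

@Entity
@Immutable // Hibernate-specific hint: rows from the view are never written back
@Table(name = "MATCH_STANDINGS") // name of the database view
public class StandingsRow {

    @Id
    private Long teamId;

    private String teamName;
    private int played;
    private int points;

    // getters omitted
}
```

A Spring Data repository over such an entity works like any other; you just never insert or update through it.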
Hope this helps.
We are migrating a whole application, originally developed in Oracle Forms a few years back, to a Java (7) web-based application with Hibernate (4.2.7.Final) and Hibernate Search (4.1.1.Final).
One of the requirements is that while users are working in the new, migrated version, they are still able to use the Oracle Forms version, so the Hibernate Search indexes will get out of sync. Is it feasible to implement a servlet so that some PL/SQL code calls a URL that updates the local indexes on the application server (AS)?
I thought of implementing some sort of clustering mechanism for Hibernate, but as I read through the documentation I realised that while clustering may be a good option for scalability and performance, it may be overkill just for keeping legacy data in sync.
Does anyone have an idea of how to implement a service, accessible via a servlet, that updates the local AS indexes for a given model entity with a given ID?
I don't know exactly what you mean by the clustering part, but anyway:
It seems like you are facing a problem similar to mine. I am currently working on a Hibernate Search adaptation for JPA providers other than Hibernate ORM (meaning EclipseLink, TopLink, etc.), and at the moment I am building an automatic reindexing feature. Since JPA doesn't have an event system suitable for reindexing with Hibernate Search, I came up with the idea of using database-level triggers to keep track of everything.
For a basic OneToOne relationship it's pretty straightforward; for other things like relation tables or anything that is not stored in the main table of an entity it gets a bit trickier, but once you have a system for OneToOne relationships it's not that hard to take the next step. Okay, let's start:
Imagine two entities, Place and Sorcerer, in the Lord of the Rings universe. To keep things simple, let's just say they are in a (quite restrictive :D) 1:1 relationship with each other. Normally you end up with two tables named SORCERER and PLACE.
Now you have to create three triggers (one for CREATE, one for DELETE and one for UPDATE) on each table (SORCERER and PLACE) that store information about which entity changed (only the id; for mapping tables there are always multiple ids) and how (CREATE, UPDATE, DELETE) into special update tables. Let's call these PLACE_UPDATES and SORCERER_UPDATES.
In addition to the ID of the original object that changed and the event type, these tables need an ID field that is UNIQUE across all update tables. This is necessary because if you want to feed information from the update tables into the Hibernate Search index, you have to make sure the events are applied in the right order, or you will break your index. How such a unique ID can be generated on your database should be easy to find on the internet/Stack Overflow.
Okay. Now that you have set up the triggers correctly, you just have to find a way to read all the update tables in a sensible fashion (I do this by querying multiple tables at once, sorting each query by our unique ID field and then comparing the first result of each query with the others) and then update the index.
This can be a bit tricky, and you have to find the correct way of dealing with each specific update event, but it can be done (that's what I am currently working on).
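Just to make the idea concrete (this is not the actual IndexUpdater linked below, only a rough sketch with a made-up UpdateEvent holder), the consuming side could look roughly like this, using the standard FullTextEntityManager index/purge calls:

```java
import javax.persistence.EntityManager;
import org.hibernate.search.jpa.FullTextEntityManager;
import org.hibernate.search.jpa.Search;

public class IndexEventProcessor {

    // Hypothetical representation of one row read from an *_UPDATES table
    public static class UpdateEvent {
        long eventId;          // the globally unique, ordered id
        String eventType;      // "CREATE", "UPDATE" or "DELETE"
        Class<?> entityClass;  // e.g. Place.class or Sorcerer.class
        Long entityId;         // id of the changed row
    }

    /** Applies a single event (events must be fed in eventId order). */
    public void apply(EntityManager em, UpdateEvent event) {
        FullTextEntityManager ftem = Search.getFullTextEntityManager(em);
        if ("DELETE".equals(event.eventType)) {
            // Remove the stale document from the index
            ftem.purge(event.entityClass, event.entityId);
        } else {
            // CREATE/UPDATE: load the current state and (re)index it
            Object entity = em.find(event.entityClass, event.entityId);
            if (entity != null) {
                ftem.index(entity);
            }
        }
    }
}
```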
If you're interested in that part, you can find it here:
https://github.com/Hotware/Hibernate-Search-JPA/blob/master/hibernate-search-db/src/main/java/com/github/hotware/hsearch/db/events/IndexUpdater.java
The link to the whole project is:
https://github.com/Hotware/Hibernate-Search-JPA/
This uses Hibernate-Search 5.0.0.
I hope this was of help (at least a little bit).
And about your remote indexing problem:
The update tables can easily be used as a kind of event dump until you send the events to the remote machine whose index is to be updated.
I have followed BalusC's first method to create a dynamic form from fields defined in the database.
I can get the names and values of the posted fields.
But I am confused about how to save the values into the database.
Should I pre-create a table to hold the values after creating the form, and save them there manually (by building the SQL query myself)?
Should I convert the name/value pairs to JSON objects and save those?
Should I create a simple table with id, name and value fields and save the name/value pairs there (like an EAV scheme)?
Or is there another way to persist the posted values into the database?
Regards
It looks like you're trying to work bottom-up instead of top-down.
The dynamic form in the linked answer is intended to be reused for all existing tables without the need to manually create separate JSF CRUD forms in "hardcoded" Facelets files for every single table. You should already have a generic model available which contains information about all available columns of the particular DB table (which is Field in the linked answer). This information can be extracted dynamically and generically via JPA metadata (how to do that in turn depends on the JPA provider used) or just via the good ol' JDBC ResultSetMetaData class once during the application's startup.
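A minimal sketch of the ResultSetMetaData route (table name and connection handling are placeholders), returning column names and their JDBC type names for one table:

```java
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.LinkedHashMap;
import java.util.Map;

public class ColumnIntrospector {

    /** Returns column name -> JDBC type name for the given table. */
    public Map<String, String> columnsOf(Connection conn, String table) throws SQLException {
        Map<String, String> columns = new LinkedHashMap<>();
        // Selecting zero rows is enough: only the metadata is needed.
        // The table name must come from a trusted list, never from user input.
        try (Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT * FROM " + table + " WHERE 1 = 0")) {
            ResultSetMetaData md = rs.getMetaData();
            for (int i = 1; i <= md.getColumnCount(); i++) {
                columns.put(md.getColumnName(i), md.getColumnTypeName(i));
            }
        }
        return columns;
    }
}
```

The resulting map can then drive the same kind of generic Field model that the linked answer builds by hand.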
If you really need to work bottom-up, then it gets trickier. Creating tables/columns at runtime is simply bad design (unless you intend to develop some kind of DB management tool like phpMyAdmin, of course). Without the need to create tables/columns at runtime, you basically need three tables:
1 table which contains information about which "virtual" DB tables are available.
1 table which contains information about which columns one such "virtual" DB table has.
1 table which contains information about which values one such column has.
Then you should link them together by FK relationships.
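A rough JPA sketch of those three tables (all class and field names here are invented, and the entities are shown in one snippet only for brevity):

```java
import java.util.List;
import javax.persistence.CascadeType;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.ManyToOne;
import javax.persistence.OneToMany;

@Entity
class VirtualTable {
    @Id @GeneratedValue
    Long id;
    String name;

    @OneToMany(mappedBy = "table", cascade = CascadeType.ALL)
    List<VirtualColumn> columns;
}

@Entity
class VirtualColumn {
    @Id @GeneratedValue
    Long id;
    String name;
    String type; // e.g. "TEXT", "NUMBER", "DATE"

    @ManyToOne
    VirtualTable table;

    @OneToMany(mappedBy = "column", cascade = CascadeType.ALL)
    List<VirtualValue> values;
}

@Entity
class VirtualValue {
    @Id @GeneratedValue
    Long id;
    Long rowNumber;    // which "virtual" row this value belongs to
    String textValue;  // stored as text; convert based on the column type

    @ManyToOne
    VirtualColumn column;
}
```

Whether this beats simply serialising the name/value pairs to JSON depends on whether you ever need to query individual values.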
I am taking a keyword and a table name from the user.
Now I want to find all the columns of that table whose data type is varchar (String).
Then I will create a query that compares the keyword with those columns, and the matching rows will be returned as the result set.
I tried a desc table_name query, but it didn't work.
Can we write a describe table query in JPQL?
If not, is there any other way to handle the above situation?
Please help and thank you in advance.
No workaround is necessary, because this is not a drawback of the technology. It is not JPQL that needs to change, it's your choice of technology. In JPQL you cannot even select data from a table. You select from classes, and these can be mapped to multiple tables at once, resulting in SQL joins even for the simplest queries. Describing such a join would be meaningless. And even if you could describe a table, you do not use column names in JPQL, but properties of objects. Describing tables in JPQL makes no sense.
JPQL is meant for querying objects, not tables. It is also meant for static work (where classes are mapped to relations once and for good) and not for dynamic things like mapping tables to objects on the fly or live inspection of the database (that is what Rails' ActiveRecord is for). Dynamic discovery of properties is not part of it.
Depending on what you really want to achieve (we only know what you are trying to do, which is different) you have two basic choices:
if you are trying to write a piece of software in a dynamic way, so that it adjusts itself to changes in the schema, drop JPQL (or any other ORM). Java classes are meant to be static; you can't really map them to dynamic tables (or grow new attributes). Use rowsets; they work fine and they will let you use SQL (see the sketch after this list);
if you are building a clever library that can be shared by many projects and so has to work with many different static mappings, use the reflection API to find the properties of the objects that you query for. The names of the columns in the table will not help you anyway, since in JPQL queries you have to use the names defined in the mappings.
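For the rowset option, a minimal sketch using the standard javax.sql.rowset API (Java 7+; the connection details and SQL are made-up examples):

```java
import javax.sql.rowset.CachedRowSet;
import javax.sql.rowset.RowSetProvider;

public class RowSetExample {

    public static void main(String[] args) throws Exception {
        try (CachedRowSet rowSet = RowSetProvider.newFactory().createCachedRowSet()) {
            // Plain SQL, no mapping classes needed
            rowSet.setUrl("jdbc:oracle:thin:@//localhost:1521/XE"); // example URL
            rowSet.setUsername("scott");
            rowSet.setPassword("tiger");
            rowSet.setCommand("SELECT * FROM EMPLOYEES WHERE SALARY > ?");
            rowSet.setInt(1, 50000);
            rowSet.execute();

            while (rowSet.next()) {
                // Columns are discovered at runtime and accessed by name or index
                System.out.println(rowSet.getString("FIRST_NAME"));
            }
        }
    }
}
```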
Map the database dictionary tables and read the required data from them. For an Oracle database you will need to select from these three dictionary views to achieve the full functionality of the describe statement: user_tab_comments, user_tab_cols and user_col_comments.
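As an illustration (the entity below is a hand-rolled, read-only mapping, and only a few of the view's columns are shown), the Oracle dictionary view can be mapped like any other table and then queried with plain JPQL:

```java
import java.io.Serializable;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.IdClass;
import javax.persistence.Table;

@Entity
@Table(name = "USER_TAB_COLS")
@IdClass(UserTabColumn.Pk.class)
public class UserTabColumn {

    @Id @Column(name = "TABLE_NAME")
    private String tableName;

    @Id @Column(name = "COLUMN_NAME")
    private String columnName;

    @Column(name = "DATA_TYPE")
    private String dataType;

    public static class Pk implements Serializable {
        private String tableName;
        private String columnName;
        // equals/hashCode omitted for brevity
    }

    // getters omitted
}
```

The VARCHAR2 columns the question asks about can then be fetched with JPQL, for example:

```java
List<String> varcharColumns = em.createQuery(
        "SELECT c.columnName FROM UserTabColumn c "
      + "WHERE c.tableName = :table AND c.dataType = 'VARCHAR2'", String.class)
    .setParameter("table", tableName.toUpperCase())
    .getResultList();
```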
There is some talk in the community about dynamic definition of the persistence unit in future releases of JPA: http://www.oracle.com/goto/newsletters/javadev/0111/blogs_sun_devoxx.html?msgid=3-3156674507
As far as I know, we cannot use a describe query in JPQL.
I am trying to create an application in Java which pulls records from a database and maps them to objects, without knowing what the schema of the database looks like. All I want to do is fetch all rows from all tables and store them somewhere. There could be a thousand tables with thousands of records each. The application doesn't know the name of any table or attribute; it should map "on the fly". I looked at Hibernate, but it doesn't give me what I want for this app. I don't want to create hard-coded XML files and classes for the mapping. Any ideas how I can accomplish this?
Thanks
Oracle has a bunch of data dictionary views for metadata.
ALL_TABLES and ALL_TAB_COLUMNS would be the first places to start. Then you'd build ad hoc queries based on what you get out of them. I'm not sure whether you have to deal with all data types (dates, BLOBs, spatial, user-defined...).
Not sure what you mean by "store them somewhere". If you start thinking CSV or XML files, you'll need to escape various characters from VARCHAR2 columns.
If you are looking for generic extract/unload routines, you should look at what is already available in the database or as open-source/commercial tools.
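If you end up rolling your own, the portable JDBC route is DatabaseMetaData plus ResultSetMetaData. Here is a rough sketch (no Oracle-specific views; schema filtering is simplified) that dumps every table into a list of maps:

```java
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class GenericDumper {

    /** table name -> list of rows, each row being a column name -> value map. */
    public Map<String, List<Map<String, Object>>> dumpAll(Connection conn) throws SQLException {
        Map<String, List<Map<String, Object>>> dump = new HashMap<>();
        DatabaseMetaData meta = conn.getMetaData();

        // Discover tables (narrow the schema pattern as needed for your database)
        try (ResultSet tables = meta.getTables(null, null, "%", new String[] {"TABLE"})) {
            while (tables.next()) {
                String table = tables.getString("TABLE_NAME");
                dump.put(table, readRows(conn, table));
            }
        }
        return dump;
    }

    private List<Map<String, Object>> readRows(Connection conn, String table) throws SQLException {
        List<Map<String, Object>> rows = new ArrayList<>();
        try (Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT * FROM " + table)) {
            ResultSetMetaData md = rs.getMetaData();
            while (rs.next()) {
                Map<String, Object> row = new LinkedHashMap<>();
                for (int i = 1; i <= md.getColumnCount(); i++) {
                    row.put(md.getColumnName(i), rs.getObject(i));
                }
                rows.add(row);
            }
        }
        return rows;
    }
}
```

With a thousand tables holding thousands of rows each, keeping everything in memory like this won't fly; in practice you would stream each table to wherever "somewhere" is instead of collecting it all first.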
MyBatis provides a pretty simple way to map data results to objects and back, maybe check that out?
http://code.google.com/p/mybatis/
Not to be flip, but for this task you might want to check out Ruby on Rails and its ActiveRecord approach.