Play Framework Database Search - java

I have a model class "Journey" in my project which has several methods to delete, create and list all of the journeys. I am using Heroku and a PostgreSQL database. I need to write a method that returns all journeys whose address is similar to a specified one. I know the query would typically look something like SELECT address FROM Journey WHERE address ~~ argument, but I don't know what functions the Play Framework provides for this.
public static void search(String address){
    //query
    //return matching journey results
}

You need to use the Model's Finder. For example:
package models;

import play.db.ebean.Model;
import javax.persistence.*;

@Entity
public class Journey extends Model {

    @Id
    public Integer id;

    public static Finder<Integer, Journey> find
            = new Model.Finder<>(Integer.class, Journey.class);

    // other fields
    public String address;
    public String country;
}
so you can easily select records with:
List<Journey> allJourneys = Journey.find.all();
List<Journey> searchedJourneys = Journey.find.where().like("address", "%foo%").findList();
Journey firstJourney = Journey.find.byId(123);
In your case you can add this to your model:
public static List<Journey> searchByAddress(String address){
    return find.where().like("address", "%" + address + "%").findList();
}
Etc. It returns whole objects with their relations, so on big data sets it can be too heavy. You can, or even should, also use more optimized queries with the Finder's chained methods such as select(), fetch(), etc. to specify exactly which data you need at the moment.
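For example, a minimal sketch (assuming the Finder above exposes Ebean's select(); the property list is just illustrative) that loads partially populated Journey objects instead of full rows:
// Only id and address are fetched and populated
List<Journey> addresses = Journey.find
        .select("id, address")
        .where()
        .like("address", "%" + address + "%")
        .findList();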
There are also other possibilities in Ebean's API; in any case you need to decide which approach is most suitable for you.
BTW, it's worth examining the existing sample applications, for example the computer-database sample, to get familiar with this ORM.
Edit
For case-insensitive searching there are additional Expressions, i.e. ilike (instead of like), istartsWith, iendsWith, ieq, icontains and iexampleLike. They do the same as the versions without the leading i.
You can preview them in the API as well.
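For instance, the searchByAddress method above can be made case insensitive simply by swapping like for ilike:
public static List<Journey> searchByAddress(String address){
    return find.where().ilike("address", "%" + address + "%").findList();
}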

Related

How to customize the API metadata of springdoc-openapi?

I'm trying to customize springdoc-openapi so that it can work with my framework, but I've run into two problems.
1. How to treat methods that do not start with is/get as properties of Model?
If users use my ORM framework from the Java language, the property getters in the entity interface can either start with is/get like a traditional Java bean, or omit the is/get prefix like a Java record, for example:
@Entity
public interface Book {

    @Id
    long id();

    String name();

    int edition();

    BigDecimal price();

    @ManyToOne
    BookStore store();

    @ManyToMany
    List<Author> authors();
}
Here the getters do not start with is/get, which looks like a Java record rather than a traditional Java bean.
However, doing this causes swagger-ui to think that the model doesn't have any attributes, so I have to change swagger's behavior.
After some research, I found that this behavior can be changed using io.swagger.v3.core.converter.ModelConverter, which looks like the most likely solution.
However, springdoc-openapi does not explain in detail how to use ModelConverter in its documentation, so in the end this goal was not achieved.
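For what it's worth, the general wiring looks roughly like the sketch below, assuming springdoc picks up ModelConverter beans automatically; the class and bean names are illustrative and the actual property-detection logic for record-style getters is omitted:
import io.swagger.v3.core.converter.AnnotatedType;
import io.swagger.v3.core.converter.ModelConverter;
import io.swagger.v3.core.converter.ModelConverterContext;
import io.swagger.v3.oas.models.media.Schema;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import java.util.Iterator;

@Configuration
public class RecordStyleModelConfig {

    @Bean
    public ModelConverter recordStyleConverter() {
        return new ModelConverter() {
            @Override
            public Schema resolve(AnnotatedType type,
                                  ModelConverterContext context,
                                  Iterator<ModelConverter> chain) {
                // Inspect type.getType() here and build a Schema for
                // record-style entity interfaces; otherwise delegate
                // to the rest of the converter chain.
                if (chain.hasNext()) {
                    return chain.next().resolve(type, context, chain);
                }
                return null;
            }
        };
    }
}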
2. How to control the shape of dynamic objects in HTTP response?
My ORM is GraphQL-style: its entity objects are dynamic so that data structures of arbitrary shapes can be queried, just like GraphQL does. For example:
@RestController
public class BookController {

    @Autowired
    private JSqlClient sqlClient;

    // Query simple book objects
    @GetMapping("/books")
    public List<Book> books() {
        return sqlClient.getEntities().findAll(Book.class);
    }

    // Query complex book objects
    @GetMapping("/books/details")
    public List<Book> bookDetails() {
        return sqlClient.getEntities().findAll(
            // Like the request body of GraphQL
            BookFetcher.$
                .allScalarFields()
                .store(
                    BookStoreFetcher.$.allScalarFields()
                )
                .authors(
                    AuthorFetcher.$.allScalars()
                )
        );
    }
}
The first query returns a list of simple book objects in the format {id, name, edition, price}
The second query returns a list of complex book objects in the format {id, name, edition, price, store: {id, name, website}, authors: {id, firstName, lastName, gender}}
Dynamic objects can vary in shape, and these are just two special cases.
I expect swagger to tell the client the shape of the object returned by each business scenario. So, I defined an annotation called @FetchBy. It should be used like this:
@RestController
public class BookController {

    private static final Fetcher<Book> BOOK_DETAIL_FETCHER =
        BookFetcher.$
            .allScalarFields()
            .store(
                BookStoreFetcher.$.allScalarFields()
            )
            .authors(
                AuthorFetcher.$.allScalars()
            );

    @Autowired
    private JSqlClient sqlClient;

    @GetMapping("/books")
    public List<Book> books() {
        return sqlClient.getEntities().findAll(Book.class);
    }

    @GetMapping("/books/details")
    public List<@FetchBy("BOOK_DETAIL_FETCHER") Book> bookDetails() {
        return sqlClient.getEntities().findAll(BOOK_DETAIL_FETCHER);
    }
}
Declare the shape of the complex object as a static constant.
The @FetchBy annotation uses the constant name to tell swagger the shape of the returned dynamic object.
After some research, I found that this behavior can be changed using org.springdoc.core.customizers.OperationCustomizer, which looks like the most likely solution.
However, I found that swagger's schema tree is not consistent with the generic type definition tree in the Java language. For example, Spring's ResponseEntity<> wrapper is ignored by swagger and is not parsed as a node of the schema tree. In theory this behavior of swagger can be customized without limit, so the two trees may not always be consistent and can be difficult to analyze.
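For reference, an OperationCustomizer is usually wired up roughly like this sketch (assuming springdoc registers OperationCustomizer beans automatically; the class and bean names are illustrative and the @FetchBy handling is only hinted at in a comment):
import io.swagger.v3.oas.models.Operation;
import org.springdoc.core.customizers.OperationCustomizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.method.HandlerMethod;

@Configuration
public class FetchByOpenApiConfig {

    @Bean
    public OperationCustomizer fetchByCustomizer() {
        return (Operation operation, HandlerMethod handlerMethod) -> {
            // Inspect handlerMethod.getReturnType() for @FetchBy and
            // rewrite the response schemas of this operation accordingly.
            return operation;
        };
    }
}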

Load the embedded class directly without the embedding class in Morphia

I have the following two simple classes, which illustrate the structure of my problem.
The first class, which embeds the second one:
@Entity
public class MyClass {

    @Id
    private String myClassName;

    private String otherField;

    @Embedded
    private List<MyEmbedded> myEmbeddeds;
}
And the second class which will be embedded:
@Embedded
public class MyEmbedded {

    @Id
    private String name;

    private String some;
    private String other;
}
In the real case, both classes have a far more complicated structure, with many more fields and references.
Because of that, I don't want to load the whole MyClass object, as in most cases I only need one specific element from the myEmbeddeds list (usually with read-only access).
On the other hand, mapping MyEmbedded as a simple reference is not an option, as we have some complex queries on MyClass which heavily depend on myEmbeddeds; that would mean executing multiple queries, which is not wanted.
So, the main question is:
How can I load one specific element of the myEmbeddeds list directly as a MyEmbedded-object, without loading the "parent"-object?
Maybe there is a way using the AggregationPipeline? (You can define a "target" class in the pipeline.aggregate() method, and there are some examples of this in Morphia's tests, but I didn't get that working for my case.)
You could query MyClass based on attributes of MyEmbedded and then use a projection to pull only myEmbeddeds from the results.
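A rough sketch of that approach, assuming the Morphia 1.x Query API and that the element you need is identified by its name field (the accessors are assumed, since MyClass above only shows fields):
// Load MyClass with only the myEmbeddeds field populated,
// then pick out the element we actually need.
MyClass partial = datastore.createQuery(MyClass.class)
        .field("myEmbeddeds.name").equal("someName")
        .project("myEmbeddeds", true)
        .get();

MyEmbedded embedded = partial.getMyEmbeddeds().stream()
        .filter(e -> "someName".equals(e.getName()))
        .findFirst()
        .orElse(null);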

Is it ok to pass interface of DTO to DAO

It's about passing an interface of a DTO to a DAO.
For example, I have the following code:
public interface User {
    String getName();
}

public class SimpleUser implements User {

    protected String name;

    public SimpleUser(String name) {
        this.name = name;
    }

    @Override
    public String getName() {
        return name;
    }
}
// Mapped by Hibernate
public class PersistentUser extends SimpleUser {

    private Long id;

    // Constructor
    // Getters for id and name
    // Setters for id and name
}
I'm using a generic DAO. Is it ok if I create the DAO using the interface User instead of PersistentUser?
User user = new PersistentUser(name);
userDao.create(user);
I read a lot of topics on Stack Overflow but couldn't figure out whether this approach is ok or not. Please help me. Maybe this is stupid and will only cause me problems.
About separating beans:
I did this because I want to share some classes via an API module that can be used externally to create entities and pass them to my application. Because they use the interfaces I developed, I can pass them to my DAO for persisting.
Generally, I would say it is ok, but there are a few hidden problems. A developer could downcast the object or access state via a toString method that shouldn't be accessible. If you aren't careful, it could happen that state is serialized as JSON/XML in web services that shouldn't be serialized. The list goes on.
I created Blaze-Persistence Entity Views for exactly that use case. You essentially define DTOs for JPA entities as interfaces and apply them to a query. It supports mapping nested DTOs, collections etc., essentially everything you'd expect, and on top of that it will improve your query performance, as it generates queries fetching just the data you actually require for the DTOs.
The entity views for your example could look like this
@EntityView(PersistentUser.class)
interface User {
    String getName();
}
Querying could look like this
List<User> dtos = entityViewManager.applySetting(
        EntityViewSetting.create(User.class),
        criteriaBuilderFactory.create(em, PersistentUser.class)
).getResultList();

Morphia - change class associated with a collection

I'm trying to phase out an older Java codebase that uses MongoDB/Morphia. During this transition, I'd like the new platform to write to the same MongoDB database/collections so that the two can live side by side for a little while. That part I'm doing alright with. My issue is that in the new platform I need a different package/class structure for the objects I'm mapping with Morphia than what is currently in the collection.
For instance, in the old platform I've got this class:
package com.foo;
#Entity
public class Bar {
#Id private String id;
private String name;
...
}
In my mongo database, I now have a collection "Bar" and its documents have the className attribute set to "com.foo.Bar". That's all wonderful.
What I'd like to do in the new platform is create a brand new class in a different package to represent that entity, but have it interact with mongo in the same way. I'm hoping to be able to do something like this:
package com.foo.legacy;

@Entity("com.foo.Bar")
public class LegacyBar {
    @Id private String id;
    private String name;
    ...
}
I realize the above doesn't work, but if I change the annotation to @Entity("Bar") I don't get any errors; however, when I look up entities by id, I always get null back.
So... is there any way for me to have 2 separate VMs with 2 class structures and 2 different configurations of Morphia such that each can write to the same database/collection in the same fashion?
If I change LegacyBar to just "Bar" and create it in a package called "com.foo" then everything works as expected. I would just REALLY prefer to have the flexibility to quarantine all of this legacy data in a semi-clean fashion.
Do you even need the className attribute?
You can disable it with
@Entity(value = "Bar", noClassnameStored = true)
and drop the attribute in the database.
Quoting the official documentation:
Why would you need it? This is mainly used when storing different entities in the same collection and reading them back as the base or super class.
If you don't do this, it should be an easy workaround to allow different package structures.
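Applied to LegacyBar from the question, that would look roughly like this sketch (the annotation import assumes the org.mongodb.morphia package):
package com.foo.legacy;

import org.mongodb.morphia.annotations.Entity;
import org.mongodb.morphia.annotations.Id;

// Maps to the existing "Bar" collection without writing a className attribute
@Entity(value = "Bar", noClassnameStored = true)
public class LegacyBar {
    @Id private String id;
    private String name;
}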

Hibernate polymorphic query

I have two classes, Person and Company, derived from another class Contact. They are represented polymorphically in two tables (Person and Company). The simplified classes look like this:
public abstract class Contact {
    Integer id;
    public abstract String getDisplayName();
}

public class Person extends Contact {
    String firstName;
    String lastName;

    public String getDisplayName() {
        return firstName + " " + lastName;
    }
}

public class Company extends Contact {
    String name;

    public String getDisplayName() {
        return name;
    }
}
The problem is that I need to make a query finding all contacts with displayName containing a certain string. I can't make the query using displayName because it is not part of either table. Any ideas on how to do this query?
Because you do the concatenation in the Java class, there is no way Hibernate can really help you with this one, sorry. It simply cannot see what you are doing in this method, since it is in fact not related to persistence at all.
The solution depends on how you mapped the inheritance of these classes:
If it is table-per-hierarchy you can use this approach: Write a SQL where clause for a criteria query, and then use a case statement:
s.createCriteria(Contact.class)
    .add(Restrictions.sqlRestriction("? = case when type='Person' then firstName || ' ' || lastName else name end"))
    .list();
If it is table per-concrete-subclass, then you are better off writing two queries (since that is what Hibernate will do anyway).
You could create a new column in the Contact table containing the respective displayName, which you could fill via a Hibernate Interceptor so it would always contain the right string automatically.
The alternative would be to have two queries, one for the Person table and one for the Company table, each containing the respective search logic. You may have to use native queries to search for a concatenated string via a LIKE query (I'm not an HQL expert, though; it may well be possible).
If you have large tables, you should alternatively think about full-text indexing, as LIKE '%...%' queries require a full table scan unless your database supports full text indexes.
If you change displayName to be a mapped property (set to the name column in Company and to a formula like first || ' ' || last in Person), then you can query for Contact and Hibernate will run two queries, both of which now have a displayName. You will get back a list of two lists, one containing Companies and one containing Persons, so you'll have to merge them back together. I think you need to query by the full package name of Contact or set up a typedef to tell Hibernate about it.
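A sketch of that mapping idea using annotations (assuming columns named first_name and last_name on the Person table; the answer doesn't specify the mapping style):
import javax.persistence.Entity;
import org.hibernate.annotations.Formula;

@Entity
public class Person extends Contact {

    String firstName;
    String lastName;

    // Evaluated in SQL, so it can be used in a LIKE query
    @Formula("first_name || ' ' || last_name")
    String displayName;

    public String getDisplayName() {
        return displayName;
    }
}
With displayName mapped on Company as well (there it can simply point at the name column), an HQL query like from Contact c where c.displayName like :term can then match on the concatenated name.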
