I have a project using Spring Boot 1.5.7 and Spring Data MongoDB 1.10.7. In my entity (a POJO annotated with @Document), I want to perform a geospatial query against one of the fields using the Spring Data repository "findByXxxWithin" scheme, passing in a Box containing the bounding box of the search area.
The field I want to query against is a polygon, not a simple point. I have found that a double[] point definition is easy to query, and some examples show that you can also use a GeoJsonPoint. For polygons it doesn't seem that easy. If my entity declares a GeoJsonPoint, the within search using the Box always comes back empty. The GeoJSON definition of a polygon is actually a three-dimensional array of values, in my case doubles. Defining the data in this manner also results in an empty result. Any attempt to use a POJO that contains the polygon definition fails. The only success I've had is with a double[][].
I would like a more accurate GeoJSON representation of the data in my objects that Spring Data is capable of querying against. Is this possible? Also, there are several other geospatial query operations available in MongoDB, such as $geoIntersects. Are these available through Spring Data? Or is there perhaps a lower-level API I can use to formulate these queries directly against Mongo if Spring Data does not support them?
Let me describe one of my projects with MongoDB and Spring Data, which resembles your problem statement quite a bit.
I have a document with a georeference (latitude and longitude). I used the org.geojson.LngLatAlt object to store longitudes and latitudes. You can also have multiple LngLatAlt objects; I use a Set (java.util.Set) of them. This solves the representation problem in your document.
Now, once the data is present in MongoDB, you can make geospatial queries using Spring Data.
At first it may look like Spring Data is quite inefficient at geospatial queries, and you may be tempted to use native MongoDB queries, which are certainly good and efficient. But Spring also provides a way to make such queries; it is not very direct, but it is equally efficient.
With Spring Data, you can make spatial queries using org.springframework.data.geo.Box or org.springframework.data.geo.Circle objects. Box is used for BBOX queries and Circle is used for sphere queries. Once you have your org.springframework.data.geo.Shape object, you can build a Criteria object for the Query.
Code snippet (ShapeUtils here is a custom helper that builds a Box from raw coordinates):

@Autowired
private MongoTemplate mongoTemplate;

// Build the bounding box and run a $geoWithin query against the geo field
Box bbox = ShapeUtils.getBBox(coordinates);
Query q = new Query(Criteria.where("lngLatAlt").within(bbox));
List<Lead> leads = mongoTemplate.find(q, Lead.class);
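Regarding the $geoIntersects part of the question: Spring Data MongoDB also exposes it through Criteria.intersects(...), which accepts GeoJSON shapes such as GeoJsonPolygon, so you can keep a proper GeoJSON polygon on the entity and still query it. A minimal sketch, assuming a hypothetical Region document with an "area" field (both names are mine, not from the question):

```java
import java.util.List;

import org.springframework.data.geo.Point;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.geo.GeoJsonPolygon;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

// Hypothetical document for illustration; the field maps to a real
// GeoJSON Polygon and can carry a 2dsphere index.
@Document
class Region {
    String id;
    GeoJsonPolygon area;
}

class GeoIntersectsExample {
    private final MongoTemplate mongoTemplate;

    GeoIntersectsExample(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    // Finds every Region whose polygon intersects the given search ring.
    // GeoJSON rings must be closed: the first and last Point must be equal.
    List<Region> intersecting(List<Point> closedRing) {
        GeoJsonPolygon searchArea = new GeoJsonPolygon(closedRing);
        Query q = new Query(Criteria.where("area").intersects(searchArea));
        return mongoTemplate.find(q, Region.class);
    }
}
```

This is a sketch against the 1.10.x API, not tested against your schema; the key point is that GeoJsonPolygon is a first-class field type, so the double[][] workaround should not be necessary.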
Please let me know if my solution is clear and relevant, or if you need more clarification.
Regards
Context:
So I'm writing a REST API using Spring Boot. I have a very bulky endpoint which makes a decent number of queries. The database itself is big, averaging 10 million rows per table (some hold 1k, one holds 100 million, most hold 5-30 million). The first query returns a list of string IDs. From that point on we do more operations using these IDs.
General Problem:
In some cases the endpoint is asked for a lot of these entities (1000 or 2000). Once we have the IDs, we don't want to do the following queries using an IN clause, since that would be slow for that many IDs. Fortunately PostgreSQL provides the JOIN VALUES expression, which makes the queries extremely fast, but its syntax (more precisely, the parameter binding) is unfamiliar to Hibernate.
Syntax: https://www.postgresql.org/docs/10/queries-values.html
Specific problem: passing a List of entityIds to Hibernate makes it bind
entityId, entityId, entityId ...
For a values expression, we need it to do:
(entityId), (entityId), (entityId) ...
Question: What would be the best approach here? Adding a custom binding for the parameters? If that is the best approach, how would I go about doing it?
Thank you for any ideas/suggestions in advance!
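One workaround that sidesteps Hibernate's parameter binding entirely is to generate the `(?), (?), …` placeholder list yourself and bind each id positionally in a native query. A sketch (the query text and entity names in the usage comment are invented for illustration):

```java
public class ValuesClause {

    // Builds the placeholder list for a PostgreSQL VALUES expression,
    // e.g. 3 ids -> "(VALUES (?), (?), (?))".
    public static String placeholders(int count) {
        StringBuilder sb = new StringBuilder("(VALUES ");
        for (int i = 0; i < count; i++) {
            if (i > 0) sb.append(", ");
            sb.append("(?)");
        }
        return sb.append(")").toString();
    }

    // Usage sketch with a JPA native query (names are assumptions):
    //
    // String sql = "SELECT e.* FROM entity e JOIN " + placeholders(ids.size())
    //            + " AS v(id) ON v.id = e.entity_id";
    // Query q = em.createNativeQuery(sql, Entity.class);
    // for (int i = 0; i < ids.size(); i++) {
    //     q.setParameter(i + 1, ids.get(i));  // JPA positional params are 1-based
    // }
}
```

The generated SQL varies only in placeholder count, so for repeated sizes (say, batches padded to 1000 or 2000) the server can still cache the plan.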
Looking at the following code:
mongoOps.getCollection("FooBar")
.distinct("_id", query(where("foo").is("bar")).limit(10).getQueryObject());
I would expect this to return only the first 10 distinct _ids of collection FooBar.
But unfortunately, running this against a collection having more than 10 documents matching the criteria returns all of them, ignoring the limit(10) specified here.
_id is an ObjectId.
How can I achieve this?
Is it a bug in Spring Data?
I'm already aware of how I can achieve this using an aggregation, but I'm trying to simplify the code if possible, since an aggregation takes many more lines of code.
FYI: I'm using Spring Data Mongodb 1.10.10 and unfortunately, updating is currently not an option.
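For reference, the aggregation route the question alludes to is only a few lines in the 1.10.x API; a sketch (untested, field and collection names taken from the snippet above):

```java
import static org.springframework.data.mongodb.core.aggregation.Aggregation.group;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.limit;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.match;
import static org.springframework.data.mongodb.core.aggregation.Aggregation.newAggregation;

import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationResults;
import org.springframework.data.mongodb.core.query.Criteria;

import com.mongodb.DBObject;

class DistinctWithLimit {

    // match -> group by _id (i.e. distinct) -> limit, applied server-side,
    // so only the first 10 distinct ids come back.
    static AggregationResults<DBObject> firstTenDistinctIds(MongoTemplate mongoTemplate) {
        Aggregation agg = newAggregation(
                match(Criteria.where("foo").is("bar")),
                group("_id"),
                limit(10));
        return mongoTemplate.aggregate(agg, "FooBar", DBObject.class);
    }
}
```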
I'm using MariaDB to store dynamic columns. The SQL syntax is covered in the MariaDB documentation: https://mariadb.com/kb/en/library/dynamic-columns/ .
But I couldn't find a Spring Boot JPA implementation. I have tried a JPA native query. I am storing JSON in this dynamic column, which has the BLOB data type in MariaDB, but it is very difficult because I couldn't find a way to store JSON that nests other objects or arrays. Is there any way to accomplish this task?
JSON is just a string. It could be stored in a TEXT or BLOB column.
The issue is that it is a structured string and it is tempting to reach into it to inspect/modify fields. Don't. It won't be efficient.
Copy values out of the JSON that you need for WHERE or ORDER BY (etc) so that MySQL/MariaDB can efficiently access them. Sure, keep copies inside the JSON text/blob, if you like.
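That copy-out advice can be wired in at the JPA level with a lifecycle callback that fills the extracted columns before each write. A sketch only; the entity, the "city" column, and the Jackson call in the comment are all invented for illustration:

```java
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Lob;
import javax.persistence.PrePersist;
import javax.persistence.PreUpdate;

@Entity
public class Profile {

    @Id
    private Long id;

    // The full JSON kept verbatim in a TEXT/BLOB column.
    @Lob
    private String attributesJson;

    // Copied-out value so WHERE / ORDER BY can hit a plain, indexable column.
    @Column(name = "city")
    private String city;

    @PrePersist
    @PreUpdate
    void copySearchableFields() {
        // Extract with whatever JSON library you already use, e.g. Jackson:
        // this.city = new ObjectMapper().readTree(attributesJson)
        //                               .path("city").asText(null);
    }
}
```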
Please describe your use of JSON; there may be more tips on "proper" usage inside a database.
That being said, MariaDB 10.2 has picked up much of Oracle's JSON functionality. But there is no JSON datatype, and there are a few function differences.
And this says nothing about how far behind the third-party software (Spring Boot, etc.) is.
I'm using MongoDB and PostgreSQL in my application. We need MongoDB because any number of new fields might get inserted, and the data for those fields is stored in MongoDB.
We are storing our fixed field values in PostgreSQL and custom field values in MongoDB.
E.g.
**Employee Table (RDBMS):**
id Name Salary
1 Krish 40000
**Employee Collection (MongoDB):**
{
<some autogenerated id of mongodb>
instanceId: 1 (The id of SQL: MANUALLY ASSIGNED),
employeeCode: A001
}
We get the records from SQL, and using their ids we fetch the related records from MongoDB, then map the results to get the values of the new fields and send them to the UI.
Now I'm searching for an optimized solution to get the MongoDB results into the PostgreSQL POJO / model, so I don't have to fetch the data manually from MongoDB by passing the SQL ids and then mapping them again.
Is there any way to link MongoDB with PostgreSQL through columns (here, the id of the RDBMS and the instanceId of MongoDB) so that one fetch also returns the related Mongo result? Any return type is acceptable, but I need all of it in one call.
I'm using Hibernate and Spring in my application.
Using Spring Data might be the best solution for your use case, since it supports both:
JPA
MongoDB
You can still get all the data in one request, but that doesn't mean you have to use a single DB call. You can have one service call which spans two database calls. Because the PostgreSQL row is probably the primary entity, I advise you to share the PostgreSQL primary key with MongoDB too.
There's no need for separate IDs; this way you can simply fetch the SQL row and the Mongo document by the same ID. Sharing the same ID also lets you process those requests concurrently and merge the results before returning from the service call, so the service method's duration will not be the sum of the two repository calls but roughly the maximum of the two.
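The concurrent fetch-and-merge can be sketched with plain CompletableFutures; the supplier/merge signatures below are illustrative generics, not a Spring API:

```java
import java.util.concurrent.CompletableFuture;
import java.util.function.BiFunction;
import java.util.function.Supplier;

public class ParallelFetch {

    // Runs the SQL and Mongo repository calls concurrently and merges the
    // two results; the service call then takes roughly
    // max(sqlCall, mongoCall) instead of their sum.
    public static <A, B, R> R fetchBoth(Supplier<A> sqlCall,
                                        Supplier<B> mongoCall,
                                        BiFunction<A, B, R> merge) {
        CompletableFuture<A> sql = CompletableFuture.supplyAsync(sqlCall);
        CompletableFuture<B> mongo = CompletableFuture.supplyAsync(mongoCall);
        return sql.thenCombine(mongo, merge).join();
    }
}
```

In practice the two suppliers would wrap the JPA and Mongo repository calls, e.g. `fetchBoth(() -> employeeRepo.findOne(id), () -> mongoRepo.findByInstanceId(id), EmployeeDto::new)` (those repository names are assumptions).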
Astonishingly, yes, you potentially can. There's a foreign data wrapper named mongo_fdw that allows PostgreSQL to query MongoDB. I haven't used it and have no opinion as to its performance, utility or quality.
I would be very surprised if you could effectively use this via Hibernate, unless you can convince Hibernate that the FDW mapped "tables" are just views. You might have more luck with EclipseLink and their "NoSQL" support if you want to do it at the Java level.
Separately, this sounds like a monstrosity of a design. There are many sane ways to do what you want within a decent RDBMS, without going for a hybrid database platform. There's a time and a place for hybrid, but I really doubt your situation justifies the complexity.
Just use PostgreSQL's json / jsonb support for dynamic mappings. Or use traditional options like storing JSON as text, storing XML, or even an EAV mapping. Don't build a Rube Goldberg machine.
Recently I have been working on a bilingual project, and for some reference tables I need to sort the data. Because it is bilingual, the data comes in different languages (in my case English and French) and I'd like to sort them all together; for example, Île should come before Inlet.
An ordinary ORDER BY will put Île at the end of the list. I finally resorted to a nativeQuery and sorted the data using the database engine's own function (in Oracle, using NLS_SORT).
But that ties me to a database engine and version, so if I change my database to, say, PostgreSQL, the application will break. I'm looking for a native JPA solution (if one exists) or any other approach.
To achieve this without a native query in the JPA definition, I can see two ways:
Create a DB view which includes escaped/translated columns based on DB functions, so the DB differences are confined to the CREATE VIEW statement. You can define a @OneToOne relation from this entity to the original one.
Create an extra column which stores the escaped values and sort by it. The application can perform the escape/translation before storing the data, using JPA entity listeners or in the persist/merge methods.
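The second option can be as small as a sort-key normalizer invoked from an entity listener. A sketch using java.text.Normalizer (the column/entity wiring is left out; java.text.Collator with a locale is the more complete alternative if you need full language-specific rules):

```java
import java.text.Normalizer;

public class SortKeys {

    // Decomposes accented characters (NFD) and strips the combining marks,
    // so "Île" becomes "ile" and an ordinary ORDER BY on the extra column
    // places it before "inlet".
    public static String sortKey(String value) {
        String decomposed = Normalizer.normalize(value, Normalizer.Form.NFD);
        return decomposed.replaceAll("\\p{M}", "").toLowerCase();
    }
}
```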
Good luck!