I'm working with Spring Data MongoDB and using aggregation to fetch documents.
List<AggregationOperation> operationsList = new ArrayList<AggregationOperation>();
operationsList.add(Aggregation.unwind("calendarEvent"));
operationsList.add(Aggregation.match(criteria));
operationsList.add(getMacroEventProjectionFields());

if (start <= 0) {
    start = 1;
}
operationsList.add(Aggregation.skip(start - 1));

if (limit > 0) {
    operationsList.add(Aggregation.limit(limit));
}

Aggregation aggregation = Aggregation.newAggregation(operationsList);
AggregationResults<MacroEvent> groupResults = mongoTemplate.aggregate(
        aggregation, mongoTemplate.getCollectionName(KALiEvent.class),
        MacroEvent.class);
The collection contains thousands of records, and this was working fine until yesterday. But today it has started throwing the following exception:
Error [exception: aggregation result exceeds maximum document size (16MB)]
I googled this issue and found that an aggregation can only return results up to the maximum MongoDB document size, which is 16 MB. The only workaround I found is the $out pipeline stage, which does the following:
Takes the documents returned by the aggregation pipeline and writes them to a specified collection. The $out operator must be the last stage in the pipeline. The $out operator lets the aggregation framework return result sets of any size.
But I just can't figure out how this can be done using Spring Data MongoDB, and how I can use $out in the code pasted above.
In order to avoid that 16 MB exception, I've created my own OutOperation class that performs the MongoDB $out pipeline stage:
package com.myop;

import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;
import org.springframework.data.mongodb.core.aggregation.AggregationOperation;
import org.springframework.data.mongodb.core.aggregation.AggregationOperationContext;

public class OutOperation implements AggregationOperation {

    /**
     * the name of the collection where the aggregation output will be stored
     */
    private String collectionName;

    public OutOperation(String collectionName) {
        this.collectionName = collectionName;
    }

    @Override
    public DBObject toDBObject(AggregationOperationContext context) {
        return new BasicDBObject("$out", collectionName);
    }
}
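For completeness, here is a minimal usage sketch, assuming the skip/limit stages are left out of the pipeline and that "aggregationOutput" is an assumed name for the output collection; paging is then done against that collection instead:

// Hedged usage sketch: $out must be the last stage, so the pipeline writes every
// result into "aggregationOutput" (an assumed name) and paging happens afterwards.
operationsList.add(new OutOperation("aggregationOutput"));
mongoTemplate.aggregate(Aggregation.newAggregation(operationsList),
        mongoTemplate.getCollectionName(KALiEvent.class), MacroEvent.class);

// read the results back page by page from the output collection
List<MacroEvent> page = mongoTemplate.find(
        new Query().skip(start - 1).limit(limit),
        MacroEvent.class, "aggregationOutput");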
I am trying to get data with pagination from CosmosTemplate.paginationQuery(), but the problem is that I am not getting data from the offset that I set in the pagination object. Below is my code for setting up pagination:
DocumentQuery documentQuery = new DocumentQuery(criteriaList, CriteriaType.AND);
if (Objects.nonNull(Offset) && Objects.nonNull(limit)) {
    PageRequest cosmosPageRequest = CosmosPageRequest.of(Offset, limit);
    documentQuery.with(cosmosPageRequest);
    Page<User> page = cosmosTemplate.paginationQuery(documentQuery, User.class, COLLECTION_NAME);
    ...
}
This always returns the first set of objects. For example, when I set the offset to 11 and the limit to 10, it still returns the records from offset 0 to 10. I checked the library as well, but nowhere is the offset applied while retrieving records. Below is the relevant code from the azure-cosmosdb library, AbstractQueryGenerator.generateCosmosQuery():
protected SqlQuerySpec generateCosmosQuery(@NonNull CosmosQuery query,
                                           @NonNull String queryHead) {
    final Pair<String, List<Pair<String, Object>>> queryBody = generateQueryBody(query);
    String queryString = String.join(" ", queryHead, queryBody.getFirst(), generateQueryTail(query));
    final List<Pair<String, Object>> parameters = queryBody.getSecond();

    List<SqlParameter> sqlParameters = parameters.stream()
        .map(p -> new SqlParameter("@" + p.getFirst(),
            toCosmosDbValue(p.getSecond())))
        .collect(Collectors.toList());

    if (query.getLimit() > 0) {
        queryString = new StringBuilder(queryString)
            .append("OFFSET 0 LIMIT ")
            .append(query.getLimit()).toString();
    }

    return new SqlQuerySpec(queryString, sqlParameters);
}
Here too, the offset is hard-coded instead of being taken from the pagination object. Can anyone suggest whether I am doing something wrong, or whether retrieving records based on an offset is simply not supported in this library?
This is a bug in the azure-spring-data-cosmos SDK: it does not honor the OFFSET set in the CosmosPageRequest and always sets it to 0.
It is currently being investigated and will be fixed soon. Follow this issue for updates - https://github.com/Azure/azure-sdk-for-java/issues/28032
However, as a workaround for now, the best approach is a custom query using the @Query annotation, as shown in this example:
Usage of offset - https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/cosmos/azure-spring-data-cosmos-test/src/test/java/com/azure/spring/data/cosmos/repository/integration/ContactRepositoryIT.java#L235
Query annotation - https://github.com/Azure/azure-sdk-for-java/blob/main/sdk/cosmos/azure-spring-data-cosmos-test/src/test/java/com/azure/spring/data/cosmos/repository/repository/ContactRepository.java#L39
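For reference, a rough sketch of what such a repository method could look like (the repository, entity and field names here are illustrative assumptions, modeled on the linked test):

// Hypothetical repository: OFFSET/LIMIT are written directly into the Cosmos SQL,
// which sidesteps the CosmosPageRequest offset bug described above.
import com.azure.spring.data.cosmos.repository.CosmosRepository;
import com.azure.spring.data.cosmos.repository.Query;
import org.springframework.data.repository.query.Param;
import java.util.List;

public interface UserRepository extends CosmosRepository<User, String> {

    @Query(value = "SELECT * FROM c WHERE c.firstName = @firstName OFFSET @offset LIMIT @limit")
    List<User> getUsersWithOffsetAndLimit(@Param("firstName") String firstName,
                                          @Param("offset") int offset,
                                          @Param("limit") int limit);
}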
I am trying to find out how to use MongoDB Atlas Search indexes from a Java application that uses spring-data-mongodb to query the data. Can anyone share an example?
What I found was code like the snippet below, but that is for MongoDB text search. Although it works, I am not sure whether it uses the index defined for Atlas Search.
TextQuery textQuery = TextQuery.queryText(new TextCriteria().matchingAny(text)).sortByScore();
textQuery.fields().include("cast").include("title").include("id");
List<Movies> movies = mongoOperations
.find(textQuery, Movies.class);
I want sample Java code using spring-data-mongodb for the query below:
[
  {
    $search: {
      index: 'cast-fullplot',
      text: {
        query: 'sandeep',
        path: {
          'wildcard': '*'
        }
      }
    }
  }
]
It would be helpful if anyone could explain how MongoDB Text Search differs from MongoDB Atlas Search, and the correct way of using Atlas Search with Java and spring-data-mongodb.
How do I write the following with spring-data-mongodb:
Arrays.asList(new Document("$search",
new Document("index", "cast-fullplot")
.append("text",
new Document("query", "sandeep")
.append("path",
new Document("wildcard", "*")))),
new Document())
Yes, spring-data-mongo supports the aggregation pipeline, which you'll use to execute your query.
You need to define a document list, with the steps defined in your query, in the correct order. Atlas Search must be the first step in the pipeline, as it stands. You can translate your query to the aggregation pipeline using the Mongo Atlas interface, they have an option to export the pipeline array in the language of your choosing. Then, you just need to execute the query and map the list of responses to your entity class.
You can see an example below:
public class SearchRepositoryImpl implements SearchRepositoryCustom {

    private final MongoClient mongoClient;

    public SearchRepositoryImpl(MongoClient mongoClient) {
        this.mongoClient = mongoClient;
    }

    @Override
    public List<SearchEntity> searchByFilter(String text) {
        // You can add codec configuration in your database object. This might be needed
        // to map your object to the mongodb data
        MongoDatabase database = mongoClient.getDatabase("aggregation");
        MongoCollection<Document> collection = database.getCollection("restaurants");

        List<Document> pipeline = List.of(new Document("$search", new Document("index", "default2")
                .append("text", new Document("query", "Many people").append("path", new Document("wildcard", "*")))));

        List<SearchEntity> searchEntityList = new ArrayList<>();
        collection.aggregate(pipeline, SearchEntity.class).forEach(searchEntityList::add);
        return searchEntityList;
    }
}
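If you would rather stay on spring-data-mongodb's MongoOperations instead of the raw driver, a sketch along these lines should also work (untested; the collection name "movies" and the lambda-style AggregationOperation returning the raw $search stage are assumptions):

// Hedged sketch: wrap the raw $search stage in a custom AggregationOperation so it can
// be combined with the usual spring-data-mongodb aggregation support.
AggregationOperation searchStage = context -> new Document("$search",
        new Document("index", "cast-fullplot")
                .append("text", new Document("query", "sandeep")
                        .append("path", new Document("wildcard", "*"))));

AggregationResults<Movies> results = mongoOperations.aggregate(
        Aggregation.newAggregation(searchStage), "movies", Movies.class);
List<Movies> movies = results.getMappedResults();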
I'm new to MongoDB and completely confused by the queries. I simply need to update a document in a MongoDB database by adding a string value (for example: Temperature) to a list of strings. From research I know that I have to use the $push operator for that. I think the code has to look somewhat like this:
BasicDBObject newDocument = new BasicDBObject().append("$set",
        new BasicDBObject().append("Subscribed Topics", topic));
collection.update(new BasicDBObject().append("Sensor Type", sensorType), newDocument);

new BasicDBObject("$push",
        new BasicDBObject("Subscribed Topics", topic));
The field with the array is called "Subscribed Topics", and "topic" is a String (Temperature). I then want to update the document in the collection with the corresponding "Sensor Type". However, I do not really know how to write the $push part correctly. I hope someone can help me sort out this part of the code.
Best regards.
Update: I tried to implement it as suggested in the duplicate question but still got an error. I am very unsure whether that is the right way anyway.
DBObject listItem = new BasicDBObject("Subscribed Topics", "Light");
DBObject updateQuery = new BasicDBObject("$push", listItem);
collection.update(query, updateQuery);
I create a new object with the value Light for the key Subscribed Topics (the array). Why do I push it to a new object then?
My goodness! This question got me descending into the long forgotten world of Java again - after all these years... ;) Anyhoo, here's a complete working example that might give you a clue of what's going on. You can run the code several times and see how the number of elements in the "Subscribed Topics" array increases.
I used the following driver: https://oss.sonatype.org/content/repositories/releases/org/mongodb/mongo-java-driver/3.3.0/mongo-java-driver-3.3.0.jar
import com.mongodb.MongoClient;
import com.mongodb.client.MongoDatabase;
import com.mongodb.client.MongoCollection;

import org.bson.Document;
import org.bson.conversions.Bson;

import static com.mongodb.client.model.Filters.*;
import static com.mongodb.client.model.Updates.*;

public class MongoDbPush {

    public static void main(String[] args)
    {
        MongoClient mongoClient = new MongoClient();
        MongoDatabase database = mongoClient.getDatabase("pushExampleDb");
        MongoCollection<Document> collection = database.getCollection("pushExampleCollection");

        String sensorType = "Temperature";

        // try to load existing document from MongoDB
        Document document = collection.find(eq("Sensor Type", sensorType)).first();

        if (document == null)
        {
            // no test document, let's create one!
            document = new Document("Sensor Type", sensorType);

            // insert it into MongoDB
            collection.insertOne(document);

            // read it back from MongoDB
            document = collection.find(eq("Sensor Type", sensorType)).first();
        }

        // see what it looks like in JSON (on the first run you will notice that it has got
        // an "_id" but no "Subscribed Topics" array yet)
        System.out.println(document.toJson());

        // update the document by adding an entry to the "Subscribed Topics" array
        Bson filter = eq("Sensor Type", sensorType);
        Bson change = push("Subscribed Topics", "Some Topic");
        collection.updateOne(filter, change);

        // read one more time from MongoDB
        document = collection.find(eq("Sensor Type", sensorType)).first();

        // see what the document looks like in JSON (on the first run you will notice that the
        // "Subscribed Topics" array has been created and has got one element in it)
        System.out.println(document.toJson());

        mongoClient.close();
    }
}
The above method still works; however, with an updated Mongo driver the approach below is also viable.
The below works for Mongo driver 3.6 onward (in this case using 3.12.4):
MongoClient mongoClient = new MongoClient();
MongoDatabase database = mongoClient.getDatabase("pushExampleDb");
MongoCollection<Document> collection = database.getCollection("pushExampleCollection");
collection.findOneAndUpdate(Filters.eq("Sensor Type",<theSensorTypeNameComesHere>),
Updates.pushEach("Subscribed Topics",<listContainingTheValuesComeHere>));
Refer: $push and $each from MongoDB Manual
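For a single value rather than a list, Updates.push can be used in the same way; a small sketch reusing the field and value names from the question:

// push one topic string onto the "Subscribed Topics" array of the matching sensor document
collection.updateOne(Filters.eq("Sensor Type", "Temperature"),
        Updates.push("Subscribed Topics", "Light"));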
I'm using the Spring framework to perform an aggregation on my mongodb. However, the lookup keeps failing and I can't understand why. Here's the query:
Aggregation aggregation = newAggregation(
        match(Criteria.where("idOfUser").is(loggedInAccount.getId())),
        group("imgID"),
        new CustomAggregationOperation(
                new BasicDBObject("$lookup",
                        new BasicDBObject("from", "img")
                                .append("localField", "_id")
                                .append("foreignField", "_id")
                                .append("as", "uniqueImgs")
                )
        ),
        limit(pageable.getPageSize()),
        skip(pageable.getPageSize() * pageable.getPageNumber())
);
AggregationResults aggregationResults = mongo.aggregate(aggregation, "comment", String.class); //Using String at the moment just to see the output clearly.
CustomAggregationOperation is as follows:
public class CustomAggregationOperation implements AggregationOperation {

    private DBObject operation;

    public CustomAggregationOperation(DBObject operation) {
        this.operation = operation;
    }

    @Override
    public DBObject toDBObject(AggregationOperationContext context) {
        return context.getMappedObject(operation);
    }
}
The Spring MongoDB version of lookup isn't recognised, which is why I'm using this CustomAggregationOperation. AFAIK that shouldn't affect anything.
Ideally what I want to happen is:
1. Get all the comments of the user.
2. Make sure that the imgID is distinct across the comments (so there are only the IDs of the imgs that have been commented on).
3. Get the actual img objects related to these IDs.
4. Paginate the returned imgs.
At the moment, step 3 doesn't work, and I think step 4 wouldn't work either, since limit and skip won't be applied to the objects in "uniqueImgs".
What is returned is:
[{ "_id" : "570e2f5cb1b9125510a443f5" , "uniqueImgs" : [ ]}]
How can I fix this?
EDIT
The imgID stored isn't an ObjectId, whereas the _id in the img collection is. Would that have any effect?
The current release (1.9.5 at the time of writing) has support for the $lookup operator, which can be implemented as follows (untested):
LookupOperation lookupOperation = LookupOperation.newLookup()
        .from("img")
        .localField("_id")
        .foreignField("_id")
        .as("uniqueImgs");

Aggregation agg = newAggregation(
        match(Criteria.where("idOfUser").is(loggedInAccount.getId())),
        group("imgID"),
        lookupOperation,
        limit(pageable.getPageSize()),
        skip(pageable.getPageSize() * pageable.getPageNumber())
);

AggregationResults aggregationResults = mongo.aggregate(agg, "comment", String.class);
I would like to create a method that composes a db query from optional query parts.
In the old days I could write a method like:
import com.mongodb.DBObject;
import com.mongodb.QueryBuilder;
public DBObject createQuery(Optional<String> aOpt, Optional<Integer> bOpt) {
    QueryBuilder builder = new QueryBuilder();
    aOpt.ifPresent(a -> builder.and("a").is(a));
    bOpt.ifPresent(b -> builder.and("b").lessThan(b));
    return builder.get();
}
How can I accomplish this with the new MongoDB Java driver 3.0?
I tried something like:
import org.bson.Document;
import com.mongodb.client.model.Filters;
public Document createQuery(Optional<String> aOpt, Optional<Integer> bOpt) {
    Document query = new Document();
    aOpt.ifPresent(a -> query.put("a", a));
    bOpt.ifPresent(b -> query.[nonExistingMethod](Filters.lt("b", b)));
    return query;
}
But all the Filters methods return Bson, which cannot be converted to a Document fluently. It seems that Filters cannot be used that way, which makes it feel pretty useless.
Am I missing something? How can this be done properly?
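One possible approach, sketched under the assumption that the resulting filter is passed straight to find(): keep the optional parts as Bson and combine them with Filters.and, since the driver's find() accepts any Bson rather than requiring a Document.

import java.util.ArrayList;
import java.util.List;
import java.util.Optional;

import org.bson.Document;
import org.bson.conversions.Bson;

import com.mongodb.client.model.Filters;

public Bson createQuery(Optional<String> aOpt, Optional<Integer> bOpt) {
    List<Bson> parts = new ArrayList<>();
    aOpt.ifPresent(a -> parts.add(Filters.eq("a", a)));
    bOpt.ifPresent(b -> parts.add(Filters.lt("b", b)));
    // an empty Document matches everything when no optional part is present
    return parts.isEmpty() ? new Document() : Filters.and(parts);
}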