I need to get the last inserted client so I can create a sequential id. I tried a @Query annotation with the following pipeline, but the app doesn't start:
public interface ClienteRepository extends MongoRepository<Cliente, String> {
@Query("[{ $sort: ({ _id: -1}).limit:(1)}]")
Cliente findLastCliente();
}
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'clienteRepository': Invocation of init method failed; nested exception is com.mongodb.util.JSONParseException:
[{ $sort: ({ _id: -1}).limit:(1)}]
You can do it in 3 ways.
1: Use a derived query method directly; Spring will automatically create the desired query for you:
Cliente findTopByOrderByIdDesc();
2: You can use Pageable and fetch only 1 record:
@Query(sort = "{ _id : -1 }")
Page<Cliente> findByName(String name, Pageable pageable);
// Inside the service
Page<Cliente> clientePage = clienteRepository.findByName("John", new PageRequest(0, 1));
3: Custom Query
Query query = new Query();
query.limit(1);
query.with(new Sort(Sort.Direction.DESC, "_id"));
List<Cliente> result = mongoOperation.find(query, Cliente.class);
I would prefer the 1st option.
By the way, the issue you are getting is because @Query("[{ $sort: ({ _id: -1}).limit:(1)}]") is not valid JSON, so it cannot be translated into a Mongo query.
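If the ids are plain numeric strings, turning the result of findTopByOrderByIdDesc() into the next sequential id is ordinary string/number work. A minimal sketch (nextSequentialId is a hypothetical helper, not part of Spring Data; zero-padded or ObjectId-style ids would need different handling):

```java
public class NextIdSketch {
    // Hypothetical helper: assumes ids are plain numeric strings.
    static String nextSequentialId(String lastId) {
        return String.valueOf(Long.parseLong(lastId) + 1);
    }

    public static void main(String[] args) {
        // e.g. the id of the Cliente returned by findTopByOrderByIdDesc()
        System.out.println(nextSequentialId("41")); // prints 42
    }
}
```

Note that with concurrent inserts this read-then-increment approach can race; a counters collection with $inc is the usual fix if that matters.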
Context
I need to rewrite some code, previously written with Jongo, using Spring Framework MongoDB. The previous code was:
eventsCollection
.aggregate("{$match:" + query + "}")
.and("{$group: {_id: '$domain', domain: {$first: '$domain'}, codes: {$push: '$code'}}}")
.and("{$project : { _id: 0, domain: 1 , codes: 1 } }")
.as(DomainCodes.class);
where eventsCollection is a Jongo MongoCollection, and query is a String containing the criteria.
Problem
The new code would probably look like:
Aggregation myAggregation = Aggregation.newAggregation(
Aggregation.match(/* something here */),
Aggregation.group("domain").first("domain").as("domain").push("code").as("codes")
);
mongoTemplate.aggregate(myAggregation, "collectionName", DomainCodes.class);
but I can't find a way to create the match criteria from a String (similar to BasicQuery, which can take a query String as its argument).
Question
In order to change as little code as possible, is there any way to use a query String as in Jongo?
Thank you,
Below is the data present in MongoDB:
{"name":"john","id":"123","location":"Pune"}
{"name":"steve","id":"456","location":"Noida"}
I want to upsert "id" to "789" and "name" to "alex" where "name" is "john" and "location" is "Pune"; as per upsert functionality, if the query condition matches nothing, a new entry should be created.
I am using the logic below with Bson filters, but I am getting the following exception:
Bson filter=null;
Bson update=null;
filter=combine(eq("name":"john"),eq("location":"Pune"));
update=combine(eq("id":"123"),eq("name":"alex"));
UpdateOptions options = new UpdateOptions();
options.upsert(true);
dbCollection.updateMany(filter, update,options);
I am expecting below change in my Mongo DB data :
{"name":"alex","id":"789","location":"Pune"}
But I'm getting the exception below:
java.lang.IllegalArgumentException: Invalid BSON field name portalID
at org.bson.AbstractBsonWriter.writeName(AbstractBsonWriter.java:532)
Can someone suggest a fix?
Try the following code:
// Static imports needed:
// import static com.mongodb.client.model.Filters.and;
// import static com.mongodb.client.model.Filters.eq;
// import static com.mongodb.client.model.Updates.combine;
// import static com.mongodb.client.model.Updates.set;
Bson filter = and(eq("name", "john"), eq("location", "Pune"));
Bson update = combine(set("id", "789"), set("name", "alex"));
UpdateOptions options = new UpdateOptions();
options.upsert(true);
dbCollection.updateMany(filter, update, options);
I am trying to update the following JSON doc in MongoDB so that a new document is created if none matches the "altKey"; if a document does match the altKey, any matching "records" should have their "domain" set and their "count" incremented. I have a requirement that the JSON structure not change and that Spring Data for MongoDB is used.
{
  "altKey": "value",
  "records": {
    "randomName1": {
      "domain": "domainValue",
      "count": 3
    },
    "randomName2": {
      "domain": "domainValue2",
      "count": 5
    },
    ...
    "randomNameN": {
      "domain": "domainValueN",
      "count": 4
    }
  }
}
The relevant portion of the class I have been attempting to do the update with is:
@Autowired
private MongoTemplate mongoTemplate;

@Override
public void increment(Doc doc) {
    Query query = new Query().addCriteria(Criteria.where("altKey").is(doc.getAltKey()));
    Update update = new Update();
    update.setOnInsert("altKey", doc.getAltKey());
    for (final Map.Entry<String, RecordData> entry : doc.getRecords().entrySet()) {
        String domainKey = format("records.{0}.domain", entry.getKey());
        String domainValue = entry.getValue().getDomain();
        update.set(domainKey, domainValue);
        String countKey = format("records.{0}.count", entry.getKey());
        Integer countValue = entry.getValue().getCount();
        update.inc(countKey, countValue);
    }
    mongoTemplate.upsert(query, update, Doc.class);
}
When I call the increment method, the "altKey" field is successfully persisted, but none of the records persist, and I am not sure why. I suspect my use of the Mongo dot operator in the keys for the set and inc update portions (i.e. "records.randomNameN.domain" or "records.randomNameN.count"), but I haven't found an alternate way to configure the Update object when I don't know the name of a particular record until run time.
Anyone out there know how to set up the Update object to handle setting nested fields?
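For reference, the dot-notation keys themselves expand the way Mongo expects; assuming format here is a static import of java.text.MessageFormat.format (which matches the {0} placeholder syntax), the key construction can be checked in isolation:

```java
import java.text.MessageFormat;

public class KeyFormatCheck {
    public static void main(String[] args) {
        // Same calls as in the increment() method above, with a fixed record name.
        String domainKey = MessageFormat.format("records.{0}.domain", "randomName1");
        String countKey = MessageFormat.format("records.{0}.count", "randomName1");
        System.out.println(domainKey); // records.randomName1.domain
        System.out.println(countKey);  // records.randomName1.count
    }
}
```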
I'm learning about Elasticsearch and I am trying to retrieve data based on a field value in the table.
I have the table (MySQL) "code" which has a field "code_group_id" and existing data in the table.
Using TypeScript and Java I would like to retrieve a List of Code objects with a specific code_group_id. I have prepared the following methods in Java:
@GetMapping("/_search/codes")
@Timed
public ResponseEntity<List<CodeDTO>> searchCodes(@RequestParam String query, Pageable pageable) {
log.debug("REST request to search for a page of Codes for query {}", query);
Page<CodeDTO> page = codeService.search(query, pageable);
HttpHeaders headers = PaginationUtil.generateSearchPaginationHttpHeaders(query, page, "/api/_search/codes");
return new ResponseEntity<>(page.getContent(), headers, HttpStatus.OK);
}
@GetMapping("/codes/currencies")
@Timed
public ResponseEntity<List<CodeDTO>> getAllByCodeGroupId(Pageable pageable) {
QueryBuilder qb = QueryBuilders.termQuery("codeGroupId", 3);
return searchCodes(qb.toString(), pageable);
}
According to the ES documentation, the term query should be the correct choice here, as I am looking for a specific term; this is supposed to return a response body containing all "code" records that have code_group_id = 3.
However, when I test the GET command on the REST API I get the following exception:
2018-04-21 21:32:47.024 ERROR 14961 --- [ XNIO-59 task-5]
c.i.s.aop.logging.LoggingAspect : Exception in ch.ice.swingkasso.service.impl.CodeServiceImpl.search() with cause = '[code] QueryParsingException[Failed to parse query [{
"term" : {
"codeGroupId" : 3
}
}]]; nested: ParseException[Cannot parse '{
"term" : {
"codeGroupId" : 3
}
}': Encountered " <RANGE_GOOP> "{\n "" at line 1, column 13.
Was expecting one of:
"]" ...
"}" ...
]; nested: ParseException[Encountered " <RANGE_GOOP> "{\n "" at line 1, column 13.
Was expecting one of:
"]" ...
"}" ...
];' and exception = 'all shards failed'
Caused by: org.elasticsearch.index.query.QueryParsingException: Failed to parse query [{
"term" : {
"codeGroupId" : 3
}
}]
Am I overlooking something simple? Thanks for any pointer in this matter.
Found the solution. The problem was that the search method re-converted the query into a QueryStringQuery, hence the parse error.
I am trying to send a request to ES from my tests. I applied a mapping and inserted documents into an ES index named 'gccount_test' from the same test. I have a very simple query, maintained in a file named member, that I want to test.
{
"query" : {
"match_all" : {}
}
}
My test method is
public void testMemberQuery(){
final Charset CHARSET = StandardCharsets.UTF_8
//load query
byte[] bytes = Files.readAllBytes(Paths.get(MEMBER_QUERY_PATH))
String query = CHARSET.decode(ByteBuffer.wrap(bytes)).toString()
println "QUERY => ${query}"
SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder()
searchSourceBuilder.query(query)
SearchRequestBuilder searchRequestBuilder = client.prepareSearch(INDEX_NAME)
//ClusterAdminClient adminClient = client.admin().cluster()
//searchRequestBuilder.setTypes(Constants.ESTYPE_MEMBER)
//println "CLUSTER => ${adminClient}"
searchRequestBuilder.setSearchType(SearchType.QUERY_THEN_FETCH);
searchRequestBuilder.internalBuilder(searchSourceBuilder)
SearchResponse searchResponse = searchRequestBuilder.execute().actionGet()
println "Search Response => ${searchResponse.toString()}"
//blah blah
}
Unfortunately, I get the following error.
Failed to execute phase [query_fetch], total failure; shardFailures {[1][gccount][0]: SearchParseException[[gccount_test][0]: from[-1],size[-1]: Parse Failure [Failed to parse source [{"query_binary":"ewogICAgInF1ZXJ5IiA6IHsgCiAgICAgICAgICAibWF0Y2hfYWxsIiA6IHt9IAogICAgIH0KfQ=="}]]]; nested: QueryParsingException[[gccount_test] No query registered for [query]]; }
org.elasticsearch.action.search.SearchPhaseExecutionException: Failed to execute phase [query_fetch], total failure; shardFailures {[1][gccount_test][0]: SearchParseException[[gccount_test][0]: from[-1],size[-1]: Parse Failure [Failed to parse source [{"query_binary":"ewogICAgInF1ZXJ5IiA6IHsgCiAgICAgICAgICAibWF0Y2hfYWxsIiA6IHt9IAogICAgIH0KfQ=="}]]]; nested: QueryParsingException[[gccount_test] No query registered for [query]]; }
at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction.onFirstPhaseResult(TransportSearchTypeAction.java:261)
at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction$3.onFailure(TransportSearchTypeAction.java:214)
at org.elasticsearch.search.action.SearchServiceTransportAction.sendExecuteFetch(SearchServiceTransportAction.java:246)
at org.elasticsearch.action.search.type.TransportSearchQueryAndFetchAction$AsyncAction.sendExecuteFirstPhase(TransportSearchQueryAndFetchAction.java:75)
at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction.performFirstPhase(TransportSearchTypeAction.java:206)
at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction.performFirstPhase(TransportSearchTypeAction.java:193)
at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction$2.run(TransportSearchTypeAction.java:179)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
at java.lang.Thread.run(Thread.java:722)
I am using the Elasticsearch 0.90.2 dependency:
[group: 'org.elasticsearch', name: 'elasticsearch', version: '0.90.2']
The same query runs fine in the real environment (snapshot below).
Is the problem with loading the query from the file, causing it to be malformed?
The exception basically means "there is no known query type called query". I'm guessing that your client library is automatically inserting the top-level query property, so your generated query actually looks like this:
{
"query" : {
"query" : {
"match_all" : {}
}
}
}
If your client can dump the JSON representation of the query, that can help a lot in debugging.
Try removing the query portion from your text file so that it is just the match_all query, and see if that works for you.
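You can also confirm exactly what the server received by decoding the query_binary payload from the stack trace; it is just your file content, base64-encoded:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class DecodeQueryBinary {
    public static void main(String[] args) {
        // Copied verbatim from the stack trace above.
        String queryBinary = "ewogICAgInF1ZXJ5IiA6IHsgCiAgICAgICAgICAibWF0Y2hfYWxsIiA6IHt9IAogICAgIH0KfQ==";
        String decoded = new String(Base64.getDecoder().decode(queryBinary), StandardCharsets.UTF_8);
        System.out.println(decoded);
        // The decoded source still carries its own top-level "query" key,
        // which is why ES reports: No query registered for [query]
    }
}
```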
Your query string should be:
String query = "{\"match_all\":{}}";
See:
https://discuss.elastic.co/t/parsingexception-in-elastic-5-0-0/64626