Query annotation not working with and / or operators - java

I'm trying to create custom MongoDB queries in Java with Spring. A simple query like this works fine:
@Query("{'status' : ?0}")
Page<Project> filterProjects(String status, Pageable pageable);
But when I try a more complicated query with $and and $or, I don't get back any results:
@Query(value = "{ '$and' : [ { '$or' : [ { 'project_title': {$regex:?0,$options:'i'}}, { 'project_description': {$regex:?0,$options:'i'}}, { 'project_short_name': {$regex:?0,$options:'i'}}]}, { 'status' : ?1}, { 'assignee' : ?2} ]}")
Page<Project> filterProjects(String search, String status, String assignee, Pageable pageable);
The raw MongoDB version of the query works fine:
db.project.find({
  $and: [
    { $or: [
      { 'project_title':       { $regex: <search>, $options: 'i' } },
      { 'project_description': { $regex: <search>, $options: 'i' } },
      { 'project_short_name':  { $regex: <search>, $options: 'i' } }
    ]},
    { 'status' : <status> },
    { 'assignee' : <assignee> }
  ]
})
Is there something wrong with the query in @Query, or are these operations not supported at all in @Query?

Have you tried removing the single quotes ('') around $and/$or in the value of your @Query annotation?
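For reference, that suggestion would look like this; an untested sketch with the same parameters as the annotation above, only the quotes around the operator keys removed:

```java
// Hypothetical variant of the @Query above with $and/$or left unquoted.
@Query(value = "{ $and : [ { $or : [ "
        + "{ 'project_title':       { $regex: ?0, $options: 'i' } }, "
        + "{ 'project_description': { $regex: ?0, $options: 'i' } }, "
        + "{ 'project_short_name':  { $regex: ?0, $options: 'i' } } ] }, "
        + "{ 'status' : ?1 }, { 'assignee' : ?2 } ] }")
Page<Project> filterProjects(String search, String status, String assignee, Pageable pageable);
```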

Related

Mongo Projection is not working for SubFields

I have a JSON object something like this. I want to exclude the field "placeOfBirth" from the response. For that I am using a projection, but somehow it works only for top-level fields, not for subfields. So placeOfBirth is never excluded, while status is removed from the response.
Here is my code:
Projection projectionExclude = Projection.of().exclude("subObject.placeOfBirth").exclude("status");
MorphiaCursor<T> cursor = datastore.aggregate(T.class)
        .match(Filters.eq("about", id))
        .project(projectionExclude)
        .execute(T.class);
if (cursor != null && cursor.hasNext()) {
    result = cursor.toList().get(0);
}
JSON data:
{
  "about": "testing/123",
  "subObject": [
    {
      "about": "subobject/123",
      "placeOfBirth": {
        "birth": ["Lisbon"]
      }
    }
  ],
  "status": "approved"
}
How can I make this work? Is there some other way to achieve this?
This actually works for me on 2.2. Here's the test I'm running:
MongoCollection<Document> collection = getDocumentCollection(User.class);
collection.insertOne(parse("{'about': 'testing/123', 'subObject' : [ {'about' : 'subobject/123', 'placeOfBirth': {'birth': " +
"['Lisbon']}}],'status' : 'approved'}"));
Document next = getDs().aggregate(User.class)
.match(eq("about", "testing/123"))
.project(project()
.exclude("subObject.placeOfBirth")
.exclude("status"))
.execute(Document.class)
.next();
assertFalse(next.toJson().contains("placeOfBirth"));
assertFalse(next.toJson().contains("status"));

Update string field concatenating another string in mongodb with spring-data

How can I update a string field in a Mongo document, concatenating another string value, using Java and spring-data-mongo? For example:
{
  "languages": "python,java,c"
}
Concatenating "kotlin" gives:
{
  "languages": "python,java,c,kotlin"
}
Thanks so much.
Starting in MongoDB v4.2, the db.collection.update() method can accept an aggregation pipeline to modify a field using the values of other fields in the document.
Update with Aggregation Pipeline
Try this one:
UpdateResult result = mongoTemplate.updateMulti(Query.query(new Criteria()),
        AggregationUpdate.update()
                .set(SetOperation.set("languages")
                        .toValue(StringOperators.Concat.valueOf("languages").concat(",").concat("kotlin"))),
        "collection"); // mongoTemplate.getCollectionName(Entity.class)
System.out.println(result);
// AcknowledgedUpdateResult{matchedCount=1, modifiedCount=1, upsertedId=null}
In MongoDB shell, it looks like this:
db.collection.updateMany({},
[
{ "$set" : { "languages" : { "$concat" : ["$languages", ",", "kotlin"]}}}
]
)

Parse Exception in ElasticSearch

I'm learning about Elasticsearch and I am trying to retrieve data based on a field value in the table.
I have the table (MySQL) "code" which has a field "code_group_id" and existing data in the table.
Using Typescript and Java I would like to retrieve a List of Code objects with a specific code_group_id. I have prepared the following methods in Java:
@GetMapping("/_search/codes")
@Timed
public ResponseEntity<List<CodeDTO>> searchCodes(@RequestParam String query, Pageable pageable) {
    log.debug("REST request to search for a page of Codes for query {}", query);
    Page<CodeDTO> page = codeService.search(query, pageable);
    HttpHeaders headers = PaginationUtil.generateSearchPaginationHttpHeaders(query, page, "/api/_search/codes");
    return new ResponseEntity<>(page.getContent(), headers, HttpStatus.OK);
}

@GetMapping("/codes/currencies")
@Timed
public ResponseEntity<List<CodeDTO>> getAllByCodeGroupId(Pageable pageable) {
    QueryBuilder qb = QueryBuilders.termQuery("codeGroupId", 3);
    return searchCodes(qb.toString(), pageable);
}
According to the ES documentation, the term query should be the correct choice here, as I am looking for a specific term, so this is supposed to return a response body containing all "code" records that have code_group_id = 3.
However, when I test the GET command on the REST API I get the following exception:
2018-04-21 21:32:47.024 ERROR 14961 --- [ XNIO-59 task-5]
c.i.s.aop.logging.LoggingAspect : Exception in ch.ice.swingkasso.service.impl.CodeServiceImpl.search() with cause = '[code] QueryParsingException[Failed to parse query [{
"term" : {
"codeGroupId" : 3
}
}]]; nested: ParseException[Cannot parse '{
"term" : {
"codeGroupId" : 3
}
}': Encountered " <RANGE_GOOP> "{\n "" at line 1, column 13.
Was expecting one of:
"]" ...
"}" ...
]; nested: ParseException[Encountered " <RANGE_GOOP> "{\n "" at line 1, column 13.
Was expecting one of:
"]" ...
"}" ...
];' and exception = 'all shards failed'
Caused by: org.elasticsearch.index.query.QueryParsingException: Failed to parse query [{
"term" : {
"codeGroupId" : 3
}
}]
Am I overlooking something simple? Thanks for any pointer in this matter.
Found the solution. The problem was that the search method re-converted the query into a QueryStringQuery, hence the parse error.
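A sketch of one possible fix along those lines: hand the QueryBuilder itself to the search, instead of its string form, so it is never re-parsed as a query_string query. This assumes the service can be changed to accept a Spring Data SearchQuery; the names and the search(SearchQuery) overload are illustrative, not from the question:

```java
// Illustrative sketch: pass the QueryBuilder directly instead of qb.toString().
@GetMapping("/codes/currencies")
@Timed
public ResponseEntity<List<CodeDTO>> getAllByCodeGroupId(Pageable pageable) {
    SearchQuery searchQuery = new NativeSearchQueryBuilder()
            .withQuery(QueryBuilders.termQuery("codeGroupId", 3))
            .withPageable(pageable)
            .build();
    Page<CodeDTO> page = codeService.search(searchQuery); // hypothetical overload taking a SearchQuery
    return new ResponseEntity<>(page.getContent(), HttpStatus.OK);
}
```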

Fuzzy query on dates with ElasticSearch Java API

I'm trying to perform the following query through ElasticSearch Java API
{
"query" : {
"fuzzy" : {
"dateOfBirth" : {
"value" : "1944-11-30",
"fuzziness" : "365d"
}
}
}
}
I don't understand how to specify the fuzziness value for 365 days in this kind of query.
You can use this:
FuzzyQueryBuilder queryBuilder = fuzzyQuery("dateOfBirth", "1944-11-30").fuzziness(Fuzziness.build("365d"));
Hope this helps
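For context, a slightly fuller sketch, assuming the static import from QueryBuilders and a transport-client-era API; the index name "people" and the `client` variable are made up for illustration:

```java
import static org.elasticsearch.index.query.QueryBuilders.fuzzyQuery;

// Build the fuzzy query with a one-year window around the date of birth.
FuzzyQueryBuilder queryBuilder = fuzzyQuery("dateOfBirth", "1944-11-30")
        .fuzziness(Fuzziness.build("365d"));

// "people" is a hypothetical index name; client is a pre-configured Client.
SearchResponse response = client.prepareSearch("people")
        .setQuery(queryBuilder)
        .get();
```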

How to enable query logging in Spring-data-elasticsearch

I use the spring-data-elasticsearch framework to get query results from an Elasticsearch server; the Java code looks like this:
SearchQuery searchQuery = new NativeSearchQueryBuilder()
.withQuery(matchAllQuery()).withSearchType(SearchType.COUNT)
.addAggregation(new MinBuilder("min_createDate").field("createDate"))
.build();
List<Entity> list = template.queryForList(searchQuery, Entity.class);
But how can I see the raw HTTP query sent to the Elasticsearch server?
How can I enable logging? I tried adding log4j, but it seems spring-data-elasticsearch doesn't log the query.
After digging through the Spring Data code I found this helpful little logger called "tracer" (the name is not very unique).
By setting the following in application.properties
logging.level.tracer=TRACE
It will print out a full curl statement for the request along with the full JSON response from Elasticsearch.
This one is quite old, but I'd still like to share the solution that worked for me. To log Spring Data Elasticsearch queries executed through the Repository, you need to enable DEBUG logging for the package org.springframework.data.elasticsearch.core.*, e.g. as follows:
logging:
level:
org:
springframework:
data:
elasticsearch:
core: DEBUG
After that, queries will appear in logs:
{
"from" : 0,
"size" : 1,
"query" : {
"bool" : {
"should" : [ {
"query_string" : {
"query" : "John Doe",
"fields" : [ "entityName" ],
"default_operator" : "and"
}
}, {
"query_string" : {
"query" : "John Doe",
"fields" : [ "alias" ],
"default_operator" : "and"
}
} ]
}
},
"post_filter" : {
"bool" : { }
}
}
One would expect an elegant solution similar to JPA's, but it seems one simply doesn't exist.
Tested with Spring Boot 1.4.0 and Spring Data Elasticsearch 1.7.3.
If you are using Spring Boot, you can set the following in your application.properties:
logging.level.org.elasticsearch.index.search.slowlog.query=INFO
spring.data.elasticsearch.properties.index.search.slowlog.threshold.query.info=1ms
I don't have an answer for Spring Data Elasticsearch, but in ES itself you can bump up the default settings for slow query logging and see all the queries in the slow log. More details about slow log here.
As to how to change the thresholds, a command like this should be used:
PUT /_settings
{
"index.search.slowlog.threshold.query.info": "1ms"
}
1ms is more or less the smallest value you can set.
This works on Spring Boot 2.3.3.RELEASE
logging.level.org.springframework.data.elasticsearch.client.WIRE=trace
I encountered the same problem. In ElasticsearchTemplate only a few methods log at debug level, e.g.:
public <T> Page<T> queryForPage(CriteriaQuery criteriaQuery, Class<T> clazz) {
    QueryBuilder elasticsearchQuery = new CriteriaQueryProcessor().createQueryFromCriteria(criteriaQuery.getCriteria());
    QueryBuilder elasticsearchFilter = new CriteriaFilterProcessor().createFilterFromCriteria(criteriaQuery.getCriteria());
    SearchRequestBuilder searchRequestBuilder = prepareSearch(criteriaQuery, clazz);
    if (elasticsearchQuery != null) {
        searchRequestBuilder.setQuery(elasticsearchQuery);
    } else {
        searchRequestBuilder.setQuery(QueryBuilders.matchAllQuery());
    }
    if (criteriaQuery.getMinScore() > 0) {
        searchRequestBuilder.setMinScore(criteriaQuery.getMinScore());
    }
    if (elasticsearchFilter != null) {
        searchRequestBuilder.setPostFilter(elasticsearchFilter);
    }
    if (logger.isDebugEnabled()) {
        logger.debug("doSearch query:\n" + searchRequestBuilder.toString());
    }
    SearchResponse response = getSearchResponse(searchRequestBuilder.execute());
    return resultsMapper.mapResults(response, clazz, criteriaQuery.getPageable());
}
@2280258 is correct, and here is the official doc:
https://docs.spring.io/spring-data/elasticsearch/docs/current/reference/html/index.html#elasticsearch.clients.logging
<logger name="org.springframework.data.elasticsearch.client.WIRE" level="trace"/>
Here is the reason: in org.springframework.data.elasticsearch.client.ClientLogger, Spring Data Elasticsearch creates a logger named "org.springframework.data.elasticsearch.client.WIRE":
private static final Logger WIRE_LOGGER = LoggerFactory
.getLogger("org.springframework.data.elasticsearch.client.WIRE");
Just to add my two cents to @AndreiStefan: now you can set 0ms instead of 1ms. It seems that some very fast queries can be captured using this method.
Simply do:
PUT /_settings
{
"index.search.slowlog.threshold.query.info": "0ms"
}
