The regex works in Java but does not work in Elasticsearch.
Java:
Pattern pattern = Pattern.compile("(\\d{8}-[01],)*(((202210((2[89])|(3[01])))|(2022((1[12]))\\d{2})|(20((2[3-9])|([3-9][0-9]))\\d{4}))-[01])*([,]\\d{8}-[01])*");
Matcher matcher = pattern.matcher("20221027-0,20221028-1");
System.out.println(matcher.matches());
It prints true.
But when I use Elasticsearch, it does not work.
The following JSON is the document I want to query in Elasticsearch:
{
"_index": "eagle_clue_v1",
"_type": "_doc",
"_id": "51740",
"_score": 0.0,
"_source": {
"id": 51740,
"next_follow_time": "20221027-0,20221028-1"
}
}
The following query does not work:
POST /eagle_clue_v1/_search
{
"from": 0,
"size": 10,
"query": {
"bool": {
"must": [
{
"bool": {
"filter": [
{
"terms": {
"id": [
"51740"
]
}
},
{
"regexp": {
"next_follow_time.keyword": {
"value": "(\\d{8}-[01],)*(((202210((2[89])|(3[01])))|(2022((1[12]))\\d{2})|(20((2[3-9])|([3-9][0-9]))\\d{4}))-[01])([,]\\d{8}-[01])*"
}
}
}
]
}
}
]
}
}
}
Check the Elasticsearch documentation page on regular expression syntax: the regexp query uses Lucene regular expression syntax, which does not support shorthand classes such as \d.
Use [0-9] instead of \d:
{
"from": 0,
"size": 10,
"query": {
"bool": {
"must": [
{
"bool": {
"filter": [
{
"regexp": {
"next_follow_time.keyword": {
"value": """([0-9]{8}-[01],)*(((202210((2[89])|(3[01])))|(2022((1[12]))[0-9]{2})|(20((2[3-9])|([3-9][0-9]))[0-9]]{4}))-[01])([,][0-9]{8}-[01])*"""
}
}
}
]
}
}
]
}
}
}
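For completeness, since the original check was done in Java: a minimal sketch of the same filter built with QueryBuilders from the Java high-level REST client (an assumption, the question does not say which client is used), again with [0-9] instead of \d:
import org.elasticsearch.index.query.BoolQueryBuilder;
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.search.builder.SearchSourceBuilder;

// Sketch only: same pattern as the corrected query above, [0-9] because Lucene regex has no \d.
String pattern = "([0-9]{8}-[01],)*(((202210((2[89])|(3[01])))|(2022((1[12]))[0-9]{2})|(20((2[3-9])|([3-9][0-9]))[0-9]{4}))-[01])([,][0-9]{8}-[01])*";
BoolQueryBuilder query = QueryBuilders.boolQuery()
        .filter(QueryBuilders.termsQuery("id", "51740"))
        .filter(QueryBuilders.regexpQuery("next_follow_time.keyword", pattern));
SearchSourceBuilder source = new SearchSourceBuilder().query(query).from(0).size(10);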
I'm trying to fetch users from ES based on the status of some of the fields.
I have 5 fields whose status I want to check, and if any of these fields has the failed status I want to fetch that record. Since it's an OR condition between these 5 fields, I was trying to use should in ES and add terms clauses to it. But it also returns records of users who don't match the criteria (see the note and sketch after the generated query below).
{
"from": 0,
"size": 50,
"query": {
"bool": {
"must": [
{
"nested": {
"query": {
"bool": {
"must": [
{
"range": {
"segment_status.updated_at": {
"from": "2021-01-24",
"to": null,
"include_lower": true,
"include_upper": true,
"boost": 1
}
}
}
],
"should": [
{
"terms": {
"segment_status.bse_status": [
2,
3,
4
],
"boost": 1
}
},
{
"terms": {
"segment_status.nse_status": [
2,
3
],
"boost": 1
}
}
],
"adjust_pure_negative": true,
"boost": 1
}
},
"path": "segment_status",
"ignore_unmapped": false,
"score_mode": "avg",
"boost": 1
}
}
],
"must_not": [
{
"term": {
"marked_failed_manually": {
"value": true,
"boost": 1
}
}
}
],
"adjust_pure_negative": true,
"boost": 1
}
},
"sort": [
{
"segment_status.updated_at": {
"order": "asc",
"mode": "min",
"nested_filter": {
"term": {
"segment_status.segment_type": {
"value": "CASH",
"boost": 1
}
}
},
"nested_path": "segment_status"
}
}
]
}
That is the query generated by the code; I'm using Spring Boot to build the query.
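One general note that may explain the behaviour (not part of the original post): when a bool query also contains must or filter clauses, its should clauses are optional by default because minimum_should_match defaults to 0, so documents that match none of the terms clauses are still returned. A minimal sketch of the inner bool built with the Java BoolQueryBuilder, assuming the field names above, and forcing at least one should clause to match:
import org.apache.lucene.search.join.ScoreMode;
import org.elasticsearch.index.query.BoolQueryBuilder;
import org.elasticsearch.index.query.QueryBuilder;
import org.elasticsearch.index.query.QueryBuilders;

// Sketch only: status values are taken from the query above.
BoolQueryBuilder inner = QueryBuilders.boolQuery()
        .must(QueryBuilders.rangeQuery("segment_status.updated_at").gte("2021-01-24"))
        .should(QueryBuilders.termsQuery("segment_status.bse_status", 2, 3, 4))
        .should(QueryBuilders.termsQuery("segment_status.nse_status", 2, 3))
        .minimumShouldMatch(1); // at least one of the status conditions must match
// this nested query then goes into the outer bool's must clause
QueryBuilder nested = QueryBuilders.nestedQuery("segment_status", inner, ScoreMode.Avg);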
Just for reference, I tried this query and it seems to work.
{
"from": 0,
"size": 50,
"query": {
"bool": {
"must": [
{
"nested":{
"query":{
"bool":{
"must" : [
{
"range" : {
"segment_status.updated_at" : {
"from" : "2021-08-30",
"to" : null,
"include_lower" : true,
"include_upper" : true,
"boost" : 1.0
}
}
}
]
}
},
"path" : "segment_status",
"ignore_unmapped" : false,
"score_mode" : "avg",
"boost" : 1.0
}
},
{
"nested": {
"query": {
"bool": {
"should": [
{
"terms": {
"segment_status.bse_status": [
2,
3
],
"boost": 1
}
}
],
"adjust_pure_negative": true,
"boost": 1
}
},
"path": "segment_status",
"ignore_unmapped": false,
"score_mode": "avg",
"boost": 1
}
}
],
"must_not": [
{
"term": {
"marked_failed_manually": {
"value": true,
"boost": 1
}
}
}
],
"adjust_pure_negative": true,
"boost": 1
}
},
"sort": [
{
"segment_status.updated_at": {
"order": "asc",
"mode": "min",
"nested_filter": {
"term": {
"segment_status.segment_type": {
"value": "CASH",
"boost": 1
}
}
},
"nested_path": "segment_status"
}
}
]
}
I am trying to generate a query dynamically based on the inputs, but in the generated query I can see that only two aggregations are being generated. How can I make each field have its own separate aggregation? Below is the code I have tried and the response I'm getting.
From main() I'm calling
buildSearchCriteria("1");
Here I am setting the aggregation type and the respective values:
public static void buildSearchCriteria(String... exceptionId) {
SearchCriteria searchCriteria = new SearchCriteria();
Map<String, List<FieldNameAndPath>> stringListMap = new HashMap<>();
stringListMap.put("nested", asList(new FieldNameAndPath("nested", "recommendations",
"recommendations", null, emptyList(), 1)));
stringListMap.put("filter", asList(new FieldNameAndPath("filter", "exceptionIds", "recommendations.exceptionId.keyword",
asList(exceptionId),
asList(new NestedAggsFields("terms", "exceptionIdsMatch")), 2)));
stringListMap.put("terms", asList(new FieldNameAndPath("terms", "by_exceptionId", "recommendations.exceptionId.keyword", null, emptyList(), 3),
new FieldNameAndPath("terms", "by_item", "recommendations.item.keyword", null, emptyList(), 4),
new FieldNameAndPath("terms", "by_destination", "recommendations.location.keyword", null, emptyList(), 5),
new FieldNameAndPath("terms", "by_trans", "recommendations.transportMode.keyword", null, emptyList(), 6),
new FieldNameAndPath("terms", "by_sourcelocation", "recommendations.sourceLocation.keyword", null, emptyList(), 7),
new FieldNameAndPath("terms", "by_shipdate", "recommendations.shipDate", null, emptyList(), 8),
new FieldNameAndPath("terms", "by_arrival", "recommendations.arrivalDate", null, emptyList(), 9)));
stringListMap.put("sum", asList(new FieldNameAndPath("sum", "quantity", "recommendations.transferQuantity", null, emptyList(), 10),
new FieldNameAndPath("sum", "transfercost", "recommendations.transferCost", null, emptyList(), 11),
new FieldNameAndPath("sum", "revenueRecovered", "recommendations.revenueRecovered", null, emptyList(), 12)));
System.out.println(stringListMap);
searchCriteria.setStringListMap(stringListMap);
aggregate(searchCriteria);
}
Below is the aggregate function, which takes the above information and builds the query:
public static void aggregate(SearchCriteria searchCriteria) throws IOException {
Map<String, List<FieldNameAndPath>> map = searchCriteria.getStringListMap();
List<FieldNameAndPath> nesteds = map.get("nested");
List<FieldNameAndPath> filter = map.get("filter");
List<FieldNameAndPath> terms = map.get("terms");
List<FieldNameAndPath> sums = map.get("sum");
SearchSourceBuilder sourceBuilder = new SearchSourceBuilder();
AggregationBuilder aggregationBuilder = new SamplerAggregationBuilder("parent");
nesteds.stream().forEach(l -> buildAggregations(l, aggregationBuilder));
filter.stream().forEach(l -> buildAggregations(l, aggregationBuilder));
terms.stream().forEach(l -> buildAggregations(l, aggregationBuilder));
sums.stream().forEach(l -> buildAggregations(l, aggregationBuilder));
SearchRequest searchRequest = new SearchRequest();
searchRequest.indices("index");
searchRequest.types("type");
sourceBuilder.aggregation(aggregationBuilder);
searchRequest.source(sourceBuilder);
System.out.println(searchRequest.source().toString());
}
buildAggregations method:
private static AggregationBuilder buildAggregations(FieldNameAndPath fieldNameAndPath , AggregationBuilder parentAggregationBuilder) {
if(fieldNameAndPath.getAggType().equals("nested")){
parentAggregationBuilder = AggregationBuilders.nested(fieldNameAndPath.getFieldName(), fieldNameAndPath.getFieldPath());
}
if(fieldNameAndPath.getAggType().equals("filter")){
parentAggregationBuilder.subAggregation(AggregationBuilders
.filter(fieldNameAndPath.getFieldName(),
QueryBuilders.termsQuery(fieldNameAndPath.getNestedAggs()
.stream().map(nestedAggsFields -> nestedAggsFields.getFieldName()).findFirst().get(), fieldNameAndPath.getFieldValues())));
}
if(fieldNameAndPath.getAggType().equals("terms")){
parentAggregationBuilder.subAggregation(AggregationBuilders.terms(fieldNameAndPath.getFieldName())
.field(fieldNameAndPath.getFieldPath()));
}
if(fieldNameAndPath.getAggType().equals("sum")){
parentAggregationBuilder.subAggregation(AggregationBuilders.
sum(fieldNameAndPath.getFieldName()).field(fieldNameAndPath.getFieldPath()));
}
return parentAggregationBuilder;
}
SearchCriteria class:
@Data
public class SearchCriteria {
Map<String, List<FieldNameAndPath>> stringListMap;
private List<String> searchFields;
}
And the DTO FieldNameAndPath:
public class FieldNameAndPath{
private String aggType;
private String fieldName;
private String fieldPath;
private List<String> fieldValues;
private List<NestedAggsFields> nestedAggs;
private int order;
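// getters and the all-args constructor used in buildSearchCriteria above are assumed here (e.g. via Lombok)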
}
And the query output from the above code is:
{
"aggregations": {
"parent": {
"sampler": {
"shard_size": 100
},
"aggregations": {
"exceptionIds": {
"filter": {
"terms": {
"exceptionIdsMatch": [
"1"
],
"boost": 1
}
}
},
"by_exceptionId": {
"terms": {
"field": "recommendations.exceptionId.keyword",
"size": 10,
"min_doc_count": 1,
"shard_min_doc_count": 0,
"show_term_doc_count_error": false,
"order": [
{
"_count": "desc"
},
{
"_key": "asc"
}
]
}
},
"by_item": {
"terms": {
"field": "recommendations.item.keyword",
"size": 10,
"min_doc_count": 1,
"shard_min_doc_count": 0,
"show_term_doc_count_error": false,
"order": [
{
"_count": "desc"
},
{
"_key": "asc"
}
]
}
},
"by_destination": {
"terms": {
"field": "recommendations.location.keyword",
"size": 10,
"min_doc_count": 1,
"shard_min_doc_count": 0,
"show_term_doc_count_error": false,
"order": [
{
"_count": "desc"
},
{
"_key": "asc"
}
]
}
},
"by_trans": {
"terms": {
"field": "recommendations.transportMode.keyword",
"size": 10,
"min_doc_count": 1,
"shard_min_doc_count": 0,
"show_term_doc_count_error": false,
"order": [
{
"_count": "desc"
},
{
"_key": "asc"
}
]
}
},
"by_sourcelocation": {
"terms": {
"field": "recommendations.sourceLocation.keyword",
"size": 10,
"min_doc_count": 1,
"shard_min_doc_count": 0,
"show_term_doc_count_error": false,
"order": [
{
"_count": "desc"
},
{
"_key": "asc"
}
]
}
},
"by_shipdate": {
"terms": {
"field": "recommendations.shipDate",
"size": 10,
"min_doc_count": 1,
"shard_min_doc_count": 0,
"show_term_doc_count_error": false,
"order": [
{
"_count": "desc"
},
{
"_key": "asc"
}
]
}
},
"by_arrival": {
"terms": {
"field": "recommendations.arrivalDate",
"size": 10,
"min_doc_count": 1,
"shard_min_doc_count": 0,
"show_term_doc_count_error": false,
"order": [
{
"_count": "desc"
},
{
"_key": "asc"
}
]
}
},
"quantity": {
"sum": {
"field": "recommendations.transferQuantity"
}
},
"transfercost": {
"sum": {
"field": "recommendations.transferCost"
}
},
"revenueRecovered": {
"sum": {
"field": "recommendations.revenueRecovered"
}
}
}
}
}
}
Expected Query is:
{
"size": 0,
"aggregations": {
"exceptionIds": {
"nested": {
"path": "recommendations"
},
"aggregations": {
"exceptionIdsMatch": {
"filter": {
"terms": {
"recommendations.exceptionId.keyword": [
"1"
],
"boost": 1
}
},
"aggregations": {
"by_exceptionId": {
"terms": {
"field": "recommendations.exceptionId.keyword",
"size": 10,
"min_doc_count": 1,
"shard_min_doc_count": 0,
"show_term_doc_count_error": false,
"order": [
{
"_count": "desc"
},
{
"_key": "asc"
}
]
},
"aggregations": {
"by_item": {
"terms": {
"field": "recommendations.item.keyword",
"size": 10,
"min_doc_count": 1,
"shard_min_doc_count": 0,
"show_term_doc_count_error": false,
"order": [
{
"_count": "desc"
},
{
"_key": "asc"
}
]
},
"aggregations": {
"by_destination": {
"terms": {
"field": "recommendations.location.keyword",
"size": 10,
"min_doc_count": 1,
"shard_min_doc_count": 0,
"show_term_doc_count_error": false,
"order": [
{
"_count": "desc"
},
{
"_key": "asc"
}
]
},
"aggregations": {
"by_trans": {
"terms": {
"field": "recommendations.transportMode.keyword",
"size": 10,
"min_doc_count": 1,
"shard_min_doc_count": 0,
"show_term_doc_count_error": false,
"order": [
{
"_count": "desc"
},
{
"_key": "asc"
}
]
},
"aggregations": {
"by_sourcelocation": {
"terms": {
"field": "recommendations.sourceLocation.keyword",
"size": 10,
"min_doc_count": 1,
"shard_min_doc_count": 0,
"show_term_doc_count_error": false,
"order": [
{
"_count": "desc"
},
{
"_key": "asc"
}
]
},
"aggregations": {
"by_shipdate": {
"terms": {
"field": "recommendations.shipDate",
"size": 10,
"min_doc_count": 1,
"shard_min_doc_count": 0,
"show_term_doc_count_error": false,
"order": [
{
"_count": "desc"
},
{
"_key": "asc"
}
]
},
"aggregations": {
"by_arrival": {
"terms": {
"field": "recommendations.arrivalDate",
"size": 10,
"min_doc_count": 1,
"shard_min_doc_count": 0,
"show_term_doc_count_error": false,
"order": [
{
"_count": "desc"
},
{
"_key": "asc"
}
]
},
"aggregations": {
"quantity": {
"sum": {
"field": "recommendations.transferQuantity"
}
},
"transfercost": {
"sum": {
"field": "recommendations.transferCost"
}
},
"revenueRecovered": {
"sum": {
"field": "recommendations.revenueRecovered"
}
}
}
}
}
}
}
}
}
}
}
}
}
}
}
}
}
}
}
}
}
}
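For reference, a minimal sketch (not the original buildAggregations code) of how the expected shape could be produced by chaining the levels with subAggregation, so each terms aggregation becomes the parent of the next one and the sum aggregations sit at the innermost level; names and paths are copied from the expected query above:
import java.util.Arrays;
import java.util.List;
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.search.aggregations.AggregationBuilder;
import org.elasticsearch.search.aggregations.AggregationBuilders;
import org.elasticsearch.search.aggregations.bucket.filter.FilterAggregationBuilder;
import org.elasticsearch.search.aggregations.bucket.nested.NestedAggregationBuilder;
import org.elasticsearch.search.aggregations.bucket.terms.TermsAggregationBuilder;
import org.elasticsearch.search.builder.SearchSourceBuilder;

// Sketch only: each level is nested under the previous one instead of being a sibling.
NestedAggregationBuilder exceptionIds = AggregationBuilders.nested("exceptionIds", "recommendations");
FilterAggregationBuilder exceptionIdsMatch = AggregationBuilders.filter("exceptionIdsMatch",
        QueryBuilders.termsQuery("recommendations.exceptionId.keyword", "1"));
exceptionIds.subAggregation(exceptionIdsMatch);

List<TermsAggregationBuilder> levels = Arrays.asList(
        AggregationBuilders.terms("by_exceptionId").field("recommendations.exceptionId.keyword"),
        AggregationBuilders.terms("by_item").field("recommendations.item.keyword"),
        AggregationBuilders.terms("by_destination").field("recommendations.location.keyword"),
        AggregationBuilders.terms("by_trans").field("recommendations.transportMode.keyword"),
        AggregationBuilders.terms("by_sourcelocation").field("recommendations.sourceLocation.keyword"),
        AggregationBuilders.terms("by_shipdate").field("recommendations.shipDate"),
        AggregationBuilders.terms("by_arrival").field("recommendations.arrivalDate"));
AggregationBuilder current = exceptionIdsMatch;
for (TermsAggregationBuilder level : levels) {
    current.subAggregation(level);
    current = level;
}

// The sum aggregations go under the innermost terms aggregation.
current.subAggregation(AggregationBuilders.sum("quantity").field("recommendations.transferQuantity"));
current.subAggregation(AggregationBuilders.sum("transfercost").field("recommendations.transferCost"));
current.subAggregation(AggregationBuilders.sum("revenueRecovered").field("recommendations.revenueRecovered"));

SearchSourceBuilder sourceBuilder = new SearchSourceBuilder().size(0).aggregation(exceptionIds);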
I am a novice in Java and I am looking for a way to flatten JSON documents.
I have tried ObjectMapper without success, and I have also tried JsonNode, but still no success.
I found this link, but the result is not what I need: https://github.com/wnameless/json-flattener
I have also been helped before, but the example was too specific and I cannot do the same thing because the documents are too long; this is why I am looking for a way to make it generic: Flatten json documents in Java
I need to transform "any" JSON document like in the example below.
Here is an example of my documents.
Document received:
{
"took": 7,
"timed_out": false,
"_shards": {
"total": 5,
"successful": 5,
"failed": 0
},
"hits": {
"total": 10,
"max_score": 0,
"hits": []
},
"aggregations": {
"groupe": {
"doc_count_error_upper_bound": 0,
"sum_other_doc_count": 0,
"buckets": [
{
"key": "a",
"doc_count": 1,
"date": {
"buckets": [
{
"key_as_string": "2017-05-03T00:00:00.000Z",
"key": 1493769600000,
"doc_count": 1,
"value": {
"value": 1
}
},
{
"key_as_string": "2017-05-03T01:00:00.000Z",
"key": 1493776800000,
"doc_count": 1,
"value": {
"value": 3
}
}
]
}
},
{
"key": "b",
"doc_count": 4,
"date": {
"buckets": [
{
"key_as_string": "2017-05-03T00:00:00.000Z",
"key": 1493769600000,
"doc_count": 1,
"value": {
"value": 4
}
},
{
"key_as_string": "2017-05-03T01:00:00.000Z",
"key": 1493773200000,
"doc_count": 1,
"value": {
"value": 3
}
}
]
}
}
]
}
}
}
Document Transformed:
{
"took": 7,
"timed_out": false,
"_shards": {
"total": 5,
"successful": 5,
"failed": 0
},
"hits": {
"total": 10,
"max_score": 0,
"hits": []
},
"aggregations": {
"groupe": {
"doc_count_error_upper_bound": 0,
"sum_other_doc_count": 0,
"buckets": [
{
"key": "a",
"doc_count": 1,
"date": {
"buckets": [
{
"key_as_string": "2017-05-03T00:00:00.000Z",
"key": 1493769600000,
"doc_count": 1,
"value": {
"value": 1
}
}
]
}
},
{
"key": "a",
"doc_count": 1,
"date": {
"buckets": [
{
"key_as_string": "2017-05-03T02:00:00.000Z",
"key": 1493776800000,
"doc_count": 1,
"value": {
"value": 3
}
}
]
}
},
{
"key": "b",
"doc_count": 1,
"date": {
"buckets": [
{
"key_as_string": "2017-05-03T02:00:00.000Z",
"key": 1493776800000,
"doc_count": 1,
"value": {
"value": 4
}
}
]
}
},
"key": "b",
"doc_count": 1,
"date": {
"buckets": [
{
"key_as_string": "2017-05-03T02:00:00.000Z",
"key": 1493776800000,
"doc_count": 1,
"value": {
"value": 4
}
}
]
}
}
]
}
}
}
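A rough sketch of that transformation with Jackson (an assumption, any JSON tree API would do, and it is specific to this response shape rather than fully generic): copy each group bucket once per entry of its date.buckets array, so every resulting bucket keeps a single date bucket.
import java.io.IOException;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ArrayNode;
import com.fasterxml.jackson.databind.node.ObjectNode;

public class BucketFlattener {
    public static String flatten(String json) throws IOException {
        ObjectMapper mapper = new ObjectMapper();
        ObjectNode root = (ObjectNode) mapper.readTree(json);
        // assumes the aggregations/groupe/buckets shape shown in the example above
        ObjectNode groupe = (ObjectNode) root.path("aggregations").path("groupe");
        ArrayNode flattened = mapper.createArrayNode();
        for (JsonNode group : groupe.path("buckets")) {
            for (JsonNode dateBucket : group.path("date").path("buckets")) {
                // copy the whole group bucket, then keep only this one date bucket in it
                ObjectNode copy = group.deepCopy();
                ArrayNode single = mapper.createArrayNode();
                single.add(dateBucket.deepCopy());
                ((ObjectNode) copy.path("date")).set("buckets", single);
                // note: the doc_count of the copied group bucket is left unchanged here
                flattened.add(copy);
            }
        }
        groupe.set("buckets", flattened);
        return mapper.writerWithDefaultPrettyPrinter().writeValueAsString(root);
    }
}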
I use
ElasticSearchTemplate().queryForPage(SearchQuery, CLASS)
How can I print the full JSON request?
I manage to print only the filter by doing:
searchQuery.getFilter().toString()
But I can't manage to do the same with:
searchQuery.getAggregations().toString();
I would like to print something like this in the console:
"aggs": {
"agg1": {
"terms": {
"field": "basket_id_1",
"size": 0
},
"aggs": {
"basket_id_2": {
"terms": {
"field": "basket_id_2",
"size": 0
},
"aggs": {
"basket_id_3": {
"terms": {
"field": "basket_id_3",
"size": 0
}
}
}
}
}
}
}
This is what I've started using to do the same thing.
{
"top_agg": {
"terms": {
"field": "id",
"size": 100
},
"aggregations": {
"parent": {
"nested": {
"path": "transactions"
},
"aggregations": {
"totals": {
"filter": {
"terms": {
"transactions.type": [
"ttype"
]
}
},
"aggregations": {
"total_events": {
"cardinality": {
"field": "parent.field"
}
}
}
}
}
}
}
}
}
NativeSearchQuery query = queryBuilder.build();
if (query.getQuery() != null) {
log.debug(query.getQuery().toString());
}
if (query.getAggregations() != null) {
try {
XContentBuilder builder = XContentFactory.contentBuilder(XContentType.JSON);
builder.startObject();
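// write each aggregation builder into the same JSON object so the whole aggs section can be logged as one string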
for (AbstractAggregationBuilder subAgg : query.getAggregations()) {
subAgg.toXContent(builder, ToXContent.EMPTY_PARAMS);
}
builder.endObject();
log.debug(builder.string());
} catch (IOException e) {
log.debug("Error parsing aggs");
}
}
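If the client version in use no longer has XContentBuilder.string() (it was removed in later releases), the same string can be obtained with Strings.toString (an assumption about the version in use):
log.debug(org.elasticsearch.common.Strings.toString(builder));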
Could you use SearchResponse.getAggregations().asList()?