I have the following $let expression:
{
"$let": {
"vars": {
"h": {
"$hour": "$Date"
}
},
"in": {
"$cond": {
"if": {
"$lt": [
"$$h",
6
]
},
"then": "Night",
"else": {
"$cond": {
"if": {
"$lt": [
"$$h",
12
]
},
"then": "Morning",
"else": {
"$cond": {
"if": {
"$lt": [
"$$h",
18
]
},
"then": "Afternoon",
"else": "Evening"
}
}
}
}
}
}
}
}
I am unable to figure out how to convert this shell operation to the Java driver. I searched the MongoDB Java driver docs and Google but could not find a way to do it.
Any help on what the Java implementation would look like is appreciated.
Thank you in advance!
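One thing that may help as a sketch: org.bson.Document implements Map&lt;String, Object&gt;, so the shell expression can be rebuilt as the same nested key/value structure in Java. Below is a minimal sketch using plain java.util.Map (with the driver on the classpath you would use new Document(...) in place of each Map.of(...) and pass it to a $project or $addFields stage); the field name "$Date" and the variable "h" are taken from the expression above.

```java
import java.util.List;
import java.util.Map;

public class DayPartExpression {
    // Builds the same nested structure as the shell $let expression.
    // With the MongoDB driver, replace Map.of with new Document(...).
    static Map<String, Object> dayPart() {
        return Map.of("$let", Map.of(
            "vars", Map.of("h", Map.of("$hour", "$Date")),
            "in", cond(lt(6), "Night",
                  cond(lt(12), "Morning",
                  cond(lt(18), "Afternoon", "Evening")))));
    }

    // { $cond: { if: ..., then: ..., else: ... } }
    static Map<String, Object> cond(Object ifExpr, Object then, Object els) {
        return Map.of("$cond", Map.of("if", ifExpr, "then", then, "else", els));
    }

    // { $lt: [ "$$h", bound ] }
    static Map<String, Object> lt(int bound) {
        return Map.of("$lt", List.of("$$h", bound));
    }
}
```

The resulting map mirrors the shell JSON one-to-one, so it can be verified by eye against the original expression before wiring it into a pipeline stage.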
We changed a field to allow null, and now JSON that previously worked fails with an AvroTypeException: Unknown union branch.
Here are the previous (working) Avro schema and the JSON used for the test:
myobject.avsc
{
"namespace":"my.model.kafka.test",
"type":"record",
"name":"MyObject",
"fields":[
{
"name":"First_Level",
"type":[
"null",
{
"type":"record",
"name":"FirstLevel",
"fields":[
{
"name":"TheTimestamp",
"doc":"Timestamp",
"type":{
"type":"long",
"logicalType":"timestamp-micros"
}
},
{
"name":"CategoryCode",
"type":{
"type":"enum",
"name":"Code",
"symbols":[
"A",
"B"
]
}
},
{
"name":"SecondLevel",
"type":{
"type":"record",
"name":"SecondLevel",
"fields":[
{
"name":"ThirdLevel",
"type":{
"type":"array",
"items":[
{
"type":"record",
"name":"ThirdLevel",
"fields":[
{
"name":"LocationCode",
"type":"string"
},
{
"name":"SomeCode",
"type":"string"
},
{
"name":"Cost",
"type":"int"
}
]
}
]
}
}
]
}
},
{
"name":"UID",
"type":[
"null",
"string"
],
"default":null
}
]
}
],
"default":null
}
]
}
Here is the JSON for the test:
{
"First_Level" : {
"my.model.kafka.test.FirstLevel" : {
"TheTimestamp" : 1648808100000000,
"CategoryCode" : "A",
"SecondLevel" : {
"ThirdLevel" : [ {
"my.model.kafka.test.ThirdLevel" : {
"LocationCode" : "BBB",
"SomeCode" : "AAA",
"Cost" : 2
}
}, {
"my.model.kafka.test.ThirdLevel" : {
"LocationCode" : "CCC",
"SomeCode" : "BBB",
"Cost" : 2
}
} ]
},
"UID" : "123-9jh789-opi8p83h3"
}
}
}
Modification to allow null
Everything works fine so far, but if we make SecondLevel nullable by changing the avsc file as follows, we get the AvroTypeException: Unknown union branch:
{
"namespace":"my.model.kafka.test",
"type":"record",
"name":"MyObject",
"fields":[
{
"name":"First_Level",
"type":[
"null",
{
"type":"record",
"name":"FirstLevel",
"fields":[
{
"name":"TheTimestamp",
"doc":"Timestamp",
"type":{
"type":"long",
"logicalType":"timestamp-micros"
}
},
{
"name":"CategoryCode",
"type":{
"type":"enum",
"name":"Code",
"symbols":[
"A",
"B"
]
}
},
{
"name":"SecondLevel",
"type":[
"null",
{
"type":"record",
"name":"SecondLevel",
"fields":[
{
"name":"ThirdLevel",
"type":{
"type":"array",
"items":[
{
"type":"record",
"name":"ThirdLevel",
"fields":[
{
"name":"LocationCode",
"type":"string"
},
{
"name":"SomeCode",
"type":"string"
},
{
"name":"Cost",
"type":"int"
}
]
}
]
}
}
],
"default":null
}
]
},
{
"name":"UID",
"type":[
"null",
"string"
],
"default":null
}
]
}
],
"default":null
}
]
}
This gives a
org.apache.avro.AvroTypeException: Unknown union branch ThirdLevel
Even if I change the JSON to include the namespace before ThirdLevel, as suggested in another Stack Overflow answer, I get the same error:
org.apache.avro.AvroTypeException: Unknown union branch my.model.kafka.test.ThirdLevel
My question is twofold:
How do I modify the avsc so that the old JSON still works and new JSON that may have SecondLevel set to null works too?
We need to make this work, but we also need to remain backward compatible, so changing names or the JSON should be avoided.
EDIT:
After running the edited avsc directly against the Kafka data, both the old and the new messages worked perfectly fine.
We have a process that saves the messages to JSON files, and the JSON from that process was the one with the problem.
Since backward compatibility was needed only for the Kafka consumer, these changes are actually fine.
For those who wonder, here is how the JSON should look after adding the null type to SecondLevel:
{
"First_Level":{
"my.model.kafka.test.FirstLevel":{
"TheTimestamp":1648808100000000,
"CategoryCode":"A",
"SecondLevel":{
"my.model.kafka.test.SecondLevel":{
"ThirdLevel":[
{
"my.model.kafka.test.ThirdLevel":{
"LocationCode":"BBB",
"SomeCode":"AAA",
"Cost":2
}
},
{
"my.model.kafka.test.ThirdLevel":{
"LocationCode":"CCC",
"SomeCode":"BBB",
"Cost":2
}
}
]
}
},
"UID":"123-9jh789-opi8p83h3"
}
}
}
How do I write a custom MongoDB query to get distinct data? I need to write it in Java, but I also want to check whether it is possible with a plain query, without using the aggregation pipeline.
Sample Data:
[
{
"id":1,
"empName":"emp1",
"emp_city":"city1"
},
{
"id":2,
"empName":"emp2",
"emp_city":"city1"
},
{
"id":3,
"empName":"emp1",
"emp_city":"city1"
},
{
"id":4,
"empName":"emp1",
"emp_city":"city2"
}
]
Expected Output:
[
{
"empName":"emp1",
"emp_city":"city1"
},
{
"empName":"emp1",
"emp_city":"city2"
},
{
"empName":"emp2",
"emp_city":"city1"
}
]
For what you are trying to achieve, I would suggest grouping by the two fields (empName and emp_city).
Here is an example: https://sqlserverguides.com/mongodb-group-by-multiple-fields/
Use this:
db.collection.aggregate([
{
$group: {
_id: {
empName: "$empName",
emp_city: "$emp_city"
}
}
},
{
"$replaceRoot": {
"newRoot": "$_id"
}
}
])
https://mongoplayground.net/p/d8i7iOuvfsR
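As a cross-check of what the $group/$replaceRoot pipeline returns, the same distinct-pair semantics can be mimicked on the sample data in plain Java with streams. This is only a sketch of the logic for verification, not a replacement for running the aggregation on the server:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class DistinctPairs {
    // Keeps one entry per distinct (empName, emp_city) pair, like the
    // $group stage keyed on both fields followed by $replaceRoot.
    static List<Map<String, String>> distinctPairs(List<Map<String, String>> docs) {
        return docs.stream()
            .map(d -> {
                Map<String, String> pair = new LinkedHashMap<>();
                pair.put("empName", d.get("empName"));
                pair.put("emp_city", d.get("emp_city"));
                return pair;
            })
            .distinct() // Map.equals compares keys and values, so duplicates collapse
            .toList();
    }
}
```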
I want to transform the following input JSON into the output JSON format below.
INPUT JSON:
[
{
"orderNumber": "201904-000000001",
"items": [
{
"itemPrice": 40000,
"itemQuantity": 11,
"item": {
"external_id": "IPHONE"
}
},
{
"itemPrice": 25000,
"itemQuantity": 22,
"item": {
"external_id": "ONEPLUS"
}
},
{
"itemPrice": 35000,
"itemQuantity": 33,
"item": {
"external_id": "SAMSUNGS10"
}
}
]
}
]
OUTPUT JSON:
[{
"orderNumber" : "201904-000000001",
"items" : [ {
"itemQuantity" : 11,
"external" : "IPHONE"
} ]
},
{
"orderNumber" : "201904-000000001",
"items" : [ {
"itemQuantity" : 22,
"external" : "ONEPLUS"
} ]
},
{
"orderNumber" : "201904-000000001",
"items" : [ {
"itemQuantity" : 33,
"external" : "SAMSUNGS10"
} ]
}]
I have tried the following spec, which is not working. Could someone guide me on the spec I should use and, if possible, explain each step, and how to convert when nested arrays and objects go even deeper?
SPEC I HAVE USED:
[
{
"operation": "shift",
"spec": {
"*": {
"orderNumber": "[&1].orderNumber",
"items": {
"*": {
"itemQuantity": "[&1].items[].itemQuantity",
"item": {
"external_id": "[&1].items[].external"
}
}
}
}
}
}
]
Thanks guys. The following spec worked for me after trying different combinations. If anyone comes across this question, please explain the answer to me so I don't have to try combinations next time:
[
{
"operation": "shift",
"spec": {
"*": {
"orderNumber": "[&1].orderNumber",
"items": {
"*": {
"itemQuantity": "[&3].items[&1].itemQuantity",
"item": {
"external_id": "[&4].items[&2].external"
}
}
}
}
}
}
]
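For anyone looking for the explanation rather than trial and error: in a Jolt shift spec, &N on the right-hand side walks N levels up the spec tree and substitutes the key (or array index) that was matched at that level, with &0 being the key matched at the current level. Annotating the working spec's tree as a sketch of that lookup:

```
"*"                   <- top-level array index
  "orderNumber"       <- &0 here; the "*" above is &1, hence "[&1].orderNumber"
  "items"
    "*"               <- index within "items"
      "itemQuantity"  <- the "*" above is &1, "items" is &2, the top "*" is &3,
                         hence "[&3].items[&1].itemQuantity"
      "item"
        "external_id" <- "item" is &1, the item index is &2, "items" is &3,
                         the top "*" is &4, hence "[&4].items[&2].external"
```

The failing spec used &1 everywhere, which from the deeper levels resolved to the wrong ancestor; each extra level of nesting adds one to the count, which is why the working spec's numbers grow as the keys sit deeper.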
I have this collection of documents:
[
{
"name": "name1",
"data": [
{
"numbers": ["1","2","3"]
}
]
},
{
"name": "name2",
"data": [
{
"numbers": ["2","5","3"]
}
]
},
{
"name": "name3",
"data": [
{
"numbers": ["1","5","2"]
}
]
},
{
"name": "name4",
"data": [
{
"numbers": ["1","4","3"]
}
]
},
{
"name": "name5",
"data": [
{
"numbers": ["1","2"]
}
]
}
]
I want to get all documents of this collection when an array passed as a parameter is a subset of data.numbers.
This is the aggregation that I'm using.
db.testing.aggregate(
[
{ "$match" : { "data.numbers" : { "$exists" : true } } },
{ "$project" : { "is_subset" : { "$filter" : { "input" : "$data", "as" : "d", "cond" : { "$setIsSubset" :[ ["1"],"$$d.numbers"] } } } } },
{ "$match" : { "is_subset.0" : { "$exists" : true } } }]
);
I'm trying to reproduce the above aggregation in Spring Data MongoDB.
How to pass an array as parameter in $filter and $setIsSubset functions?
operations.aggregate(
newAggregation(Testing.class,
match(where("data.numbers").exists(true)),
project().and(
filter("data")
.as("d")
.by(???))
.as("is_subset"),
match(where("is_subset.0").exists(true))
), Testing.class);
I solved my issue.
operations.aggregate(
newAggregation(Testing.class,
match(where("data.numbers").exists(true)),
project("id", "name").and(
filter("data")
.as("d")
.by(context -> new Document("$setIsSubset", Arrays.asList(numbers, "$$d.numbers"))))
.as("is_subset"),
match(where("is_subset.0").exists(true))
), Testing.class);
I created a Document with the content that I needed in the $filter condition.
new Document("$setIsSubset", Arrays.asList(numbers, "$$d.numbers"))
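As a side note on what that condition evaluates: $setIsSubset(a, s) is true when every element of the first array occurs in the second, with duplicates and ordering ignored. For intuition, the equivalent check in plain Java looks like the sketch below (the real evaluation of course happens server-side):

```java
import java.util.HashSet;
import java.util.List;

public class SetIsSubset {
    // Mirrors $setIsSubset: true if every element of 'numbers'
    // appears in 'docNumbers' (set semantics: duplicates and order ignored).
    static boolean isSubset(List<String> numbers, List<String> docNumbers) {
        return new HashSet<>(docNumbers).containsAll(numbers);
    }
}
```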
I have the following data structure
[{
"id": "1c7bbebd-bc3d-4352-9ac0-98c01d13189d",
"version": 0,
"groups": [
{
"internalName": "Admin group",
"fields": [
{
"internalName": "Is verified",
"uiProperties": {
"isShow": true
}
},
{
"internalName": "Hide",
"uiProperties": {
"isHide": false
}
},
...
]
},
...
]
},
{
"id": "2b7bbebd-bc3d-4352-9ac0-98c01d13189d",
"version": 0,
"groups": [
{
"internalName": "User group",
"fields": [
{
"internalName": "Is verified",
"uiProperties": {
"isShow": true
}
},
{
"internalName": "Blocked",
"uiProperties": {
"isBlocked": true
}
},
...
]
},
...
]
},
...
]
Internal names of the fields can be repeated. I want to group by group.field.internalName, cut the array (for pagination), and get output like:
{
"totalCount": 3,
"items": [
{
"internalName": "Blocked"
},
{
"internalName": "Hide"
},
{
"internalName": "Is verified"
}
]}
I wrote a query that works:
db.layouts.aggregate(
{
$unwind : "$groups"
},
{
$unwind : "$groups.fields"
},
{
$group: {
"_id" : {
"internalName" : "$groups.fields.internalName",
},
"internalName" : {
$first : "$groups.fields.internalName"
}
}
},
{
$group: {
"_id" : null,
"items" : {
$push : "$$ROOT"
},
"totalCount" : {
$sum : 1
}
}
},
{
$project: {
"items" : {
$slice : [ "$items", 0, 20 ]
},
"totalCount": 1
}
})
but I have a problem translating it to the Java API. Note that I need to use the mongoTemplate approach. Here is what I have and where I'm stuck:
final List<AggregationOperation> aggregationOperations = new ArrayList<>();
aggregationOperations.add(unwind("groups"));
aggregationOperations.add(unwind("groups.fields"));
aggregationOperations.add(
group("groups.fields.internalName")
.first("groups.fields.internalName").as("internalName")
);
aggregationOperations.add(
group()
.push("$$ROOT").as("fields")
.sum("1").as("totalCount") // ERROR: the API only accepts a String field reference here, but I need the number 1
);
aggregationOperations.add(
project()
.andInclude("totalCount")
.and("fields").slice(size, page * size)
);
final Aggregation aggregation = newAggregation(aggregationOperations);
mongoTemplate.aggregate(aggregation, LAYOUTS, FieldLites.class).getMappedResults()
With this query I have a problem with sum(), because the API only lets me pass a String field reference (but I need a number), and the project operation throws an exception:
java.lang.IllegalArgumentException: Invalid reference 'totalCount'!
Can you help me with this query translation?
You can use count():
group()
.push("$$ROOT").as("fields")
.count().as("totalCount")
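To sanity-check what the translated pipeline should produce, the unwind/group/count/slice semantics can be mimicked on the sample data in plain Java. This is only a sketch of the logic with the field names from the question; the results are sorted for a stable order, which the real $group stage does not guarantee:

```java
import java.util.List;
import java.util.Map;

public class FieldNamesPage {
    // Mimics: $unwind groups, $unwind groups.fields, $group by internalName,
    // then $group {push, count} and $project {$slice} for pagination.
    @SuppressWarnings("unchecked")
    static Map<String, Object> page(List<Map<String, Object>> layouts,
                                    int page, int size) {
        List<String> names = layouts.stream()
            .flatMap(l -> ((List<Map<String, Object>>) l.get("groups")).stream())
            .flatMap(g -> ((List<Map<String, Object>>) g.get("fields")).stream())
            .map(f -> (String) f.get("internalName"))
            .distinct()   // one entry per internalName, like the first $group
            .sorted()     // deterministic order for this sketch
            .toList();
        List<Map<String, String>> items = names.stream()
            .skip((long) page * size)  // $slice offset
            .limit(size)               // $slice length
            .map(n -> Map.of("internalName", n))
            .toList();
        return Map.of("totalCount", names.size(), "items", items);
    }
}
```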