I am building the response in Java using the following code:
List<Dto> dtos = myConsignmentDao.fetchMyData(myConsignmentRequest);
if (!dtos.isEmpty()) {
    List<MyConsignmentData> myDetails = dtos.stream()
        .collect(Collectors.groupingBy(Dto::getConsignmentNumber))
        .entrySet().stream()
        .map(myConsData -> getMyData(myConsData))
        .collect(Collectors.toList());
    myResponse.setMyConsignments(myDetails);
    return myResponse;
}
And my response is:
{
  "myNumber": [
    {
      "Number": "12345",
      "consignmentItems": [
        {
          "ItemNumber": "678954",
          "deliveryDate": "2021-01-05 09:09:53+00"
        }
      ],
      "emailAddress": "myresult@gmail.com",
      "mobileNumber": "+91377383",
      "partyType": "Receiver",
      "creationDate": "2020-12-29"
    }
  ]
}
I have written the following code using Flux:
Flux<MtDto> myDtos = myDao.fetchMyData(myConsignmentRequest);
myDtos.groupBy(MyConsignmentsDto::getConsignmentNumber).collectList();
Because of this I am getting only the root object fields (i.e. myNumber, mobileNumber, emailAddress), not itemNumber and deliveryDate. Can someone help me write this code using WebFlux?
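For reference, the nested items survive the grouping if each `GroupedFlux` is collected into a list before leaving the group. A minimal sketch with a hypothetical stand-in type (reactor-core only; `Item` and `group` are my own names, replacing the question's Dto and DAO):

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.List;
import reactor.core.publisher.Flux;

public class GroupDemo {
    // Hypothetical stand-in for the question's Dto
    public record Item(String consignmentNumber, String itemNumber) {}

    // Group by consignment number and keep every nested item:
    // collect each GroupedFlux into a list before mapping it.
    public static List<SimpleEntry<String, List<Item>>> group(Flux<Item> items) {
        return items
            .groupBy(Item::consignmentNumber)
            .flatMap(g -> g.collectList()
                .map(list -> new SimpleEntry<>(g.key(), list)))
            .collectList()
            .block(); // block() only for the demo; stay reactive in a controller
    }

    public static void main(String[] args) {
        Flux<Item> items = Flux.just(
            new Item("12345", "678954"),
            new Item("12345", "678955"),
            new Item("99999", "111111"));
        group(items).forEach(e ->
            System.out.println(e.getKey() + " -> " + e.getValue().size() + " item(s)"));
    }
}
```

In the question's code, calling only `groupBy(...).collectList()` collects the `GroupedFlux` handles themselves rather than their contents, which is why the nested item fields disappear.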
I am slightly confused about how to get the proper response when calling two REST endpoints with WebClient. I want the code to be asynchronous and non-blocking.
I am returning the Flux from the controller. The code details are as below:
The Controller Class method looks like this:
@RequestMapping(method = RequestMethod.GET, path = "/v1/repos/{userName}", produces = "application/json")
public ResponseEntity<Flux<GithubRepo>> getUserReposDetails(
        @PathVariable(name = "userName", required = true) String userName) throws Exception {
    return new ResponseEntity<>(this.getGitRepos(userName), HttpStatus.OK);
}
It is calling getGitRepos method. The method details are as below:
private Flux<GithubRepo> getGitRepos(String userName) {
    return webClient.get()
        .uri("/users/{userName}/repos", userName)
        .exchangeToFlux(clientResponse -> clientResponse.bodyToFlux(GithubRepo.class))
        .map(github -> {
            github.setBranch(webClient.get()
                .uri("/repos/{userName}/{branchName}/branches", userName, github.getBranch())
                .retrieve()
                .bodyToFlux(Branch.class)
                .collectList());
            return github;
        });
}
And WebClient is:
WebClient webClient = WebClient.builder()
    .baseUrl("https://api.github.com")
    .defaultHeader(HttpHeaders.CONTENT_TYPE, "application/vnd.github.v3+json")
    .build();
The GithubRepo and Branch classes are below:
@Data
public class GithubRepo {
    private String name;
    private String ownerLogin;
    private Mono<List<Branch>> branch;

    @JsonProperty("owner")
    private void unpackNested(Map<String, String> owner) {
        this.ownerLogin = owner.get("login");
    }
}
@Data
public class Branch {
    private String name;
    @JsonProperty("protected") // "protected" is a reserved word in Java
    private Boolean isProtected;
}
I am getting the JSON response as:
[
{
"name": "HelloWorld",
"ownerLogin": "test",
"branch": {
"scanAvailable": true
}
},
{
"name": "rokehan",
"ownerLogin": "test",
"branch": {
"scanAvailable": true
}
},
{
"name": "sNews",
"ownerLogin": "test",
"branch": {
"scanAvailable": true
}
},
{
"name": "Test--01",
"ownerLogin": "test",
"branch": {
"scanAvailable": true
}
}
]
I want to get the response as:
[
{
"name": "HelloWorld",
"ownerLogin": "test",
"branch": [
{
"name": "asd",
"protected": false
},
{
"name": "master",
"protected": false
}
]
},
{
"name": "rokehan",
"ownerLogin": "test",
"branch": [
{
"name": "master",
"protected": false
}
]
},
{
"name": "sNews",
"ownerLogin": "test",
"branch": []
},
{
"name": "Test--01",
"ownerLogin": "test",
"branch": [
{
"name": "master",
"protected": false
}
]
}
]
Please help me to resolve this problem.
I am not at a computer so I can't check against a compiler (I'm writing this on mobile).
But the problem is that you are trying to serialize a Mono<List<T>>, which means you are trying to send something that might not have been resolved yet. You need to return a List<T>.
private Flux<MyResponse> getGitRepos(String userName) {
    return webClient.get()
        .uri("/users/{userName}/repos", userName)
        .exchangeToFlux(clientResponse -> clientResponse.bodyToFlux(GithubRepo.class))
        .flatMap(github -> webClient.get()
            .uri("/repos/{userName}/{branchName}/branches", userName, github.getBranch())
            .retrieve()
            .bodyToFlux(Branch.class)
            .collectList()
            .flatMap(branches -> Mono.just(new MyResponse(github, branches))));
}
I wrote this freehand, but you should get the point. Have one class for the GitHub API and another for your own API, so you can alter your API freely if the GitHub API changes.
Thanks @Toerktumlare for helping me. The updated method is:
private Flux<MyResponse> getGitRepos(String userName) {
return webClient.get()
.uri("/users/{userName}/repos", userName)
.exchangeToFlux(clientResponse -> clientResponse.bodyToFlux(GithubRepo.class))
.flatMap(github -> {
return webClient.get()
.uri("/repos/{userName}/{branchName}/branches", userName, github.getName())
.retrieve()
.bodyToFlux(Branch.class)
.collectList()
.flatMap(branches -> {
return Mono.just(new MyResponse(github, branches));
});
});
}
I'm trying to create a class that writes automatically to Elasticsearch through the REST High Level Client with the operations create, createBatch, remove, removeBatch, update, and updateBatch. Those operations all work and my test cases all succeed. To add a bit more flexibility, I wanted to implement the following methods: find, findAll, getFirsts(n), and getLasts(n). find(key) and findAll() both work perfectly fine, but getFirsts(n) and getLasts(n) don't at all.
Here is the context:
Before each test case -> Ensure that index "test" exists and create it if it doesn't
After each test case -> Delete index "test"
For getFirsts(n) and getLasts(n) I call create to have a few items in ElasticSearch and then search according to the uniqueKey.
Here is the mapping for my Test Object:
{
"properties": {
"date": { "type": "long" },
"name": { "type": "text" },
"age": { "type": "integer" },
"uniqueKey": { "type": "keyword" }
}
}
Here is my test case:
@Test
public void testGetFirstByIds() throws BeanPersistenceException {
    List<StringTestDataBean> beans = new ArrayList<>();
    StringTestDataBean bean1 = new StringTestDataBean();
    bean1.setName("Tester");
    bean1.setAge(22);
    bean1.setTimeStamp(23213987321712L);
    beans.add(elasticSearchService.create(bean1));

    StringTestDataBean bean2 = new StringTestDataBean();
    bean2.setName("Antonio");
    bean2.setAge(27);
    bean2.setTimeStamp(2332321117321712L);
    beans.add(elasticSearchService.create(bean2));
Assert.assertNotNull("The beans created should not be null", beans);
Assert.assertEquals("The uniqueKeys of the fetched list should match the existing",
beans.stream()
.map(ElasticSearchBean::getUniqueKey)
.sorted((b1,b2) -> Long.compare(Long.parseLong(b2),Long.parseLong(b1)))
.collect(Collectors.toList()),
elasticSearchService.getFirstByIds(2).stream()
.map(ElasticSearchBean::getUniqueKey)
.collect(Collectors.toList())
);
}
Here is getFirstByIds(n):
@Override
public Collection<B> getFirstByIds(int entityCount) throws BeanPersistenceException {
    assertBinding();
    FilterContext filterContext = new FilterContext();
    filterContext.setLimit(entityCount);
    filterContext.setSort(Collections.singletonList(new FieldSort("uniqueKey", true)));
    return Optional.ofNullable(find(filterContext)).orElseThrow();
}
Here is the find(filterContext):
@Override
public List<B> find(FilterContext filter) throws BeanPersistenceException {
assertBinding();
BoolQueryBuilder query = QueryBuilders.boolQuery();
List<FieldFilter> fields = filter.getFields();
StreamUtil.ofNullable(fields)
.forEach(fieldFilter -> executeFindSwitchCase(fieldFilter,query));
SearchSourceBuilder builder = new SearchSourceBuilder().query(query);
builder.from((int) filter.getFrom());
builder.size(((int) filter.getLimit() == -1) ? FILTER_LIMIT : (int) filter.getLimit());
SearchRequest request = new SearchRequest();
request.indices(index);
request.source(builder);
List<FieldSort> sorts = filter.getSort();
StreamUtil.ofNullable(sorts)
.forEach(fieldSort -> builder.sort(SortBuilders.fieldSort(fieldSort.getField()).order(
fieldSort.isAscending() ? SortOrder.ASC : SortOrder.DESC)));
try {
if (strict)
client.indices().refresh(new RefreshRequest(index), RequestOptions.DEFAULT);
SearchResponse response = client.search(request, RequestOptions.DEFAULT);
SearchHits hits = response.getHits();
List<B> results = new ArrayList<>();
for (SearchHit hit : hits)
results.add(objectMapper.readValue(hit.getSourceAsString(), clazz));
return results;
}
catch(IOException e){
logger.error(e.getMessage(),e);
}
return null;
}
The issue happens if I run the test case more than once. The first time the test passes fine, but on the second run I get an exception:
ElasticsearchStatusException[Elasticsearch exception [type=search_phase_execution_exception, reason=all shards failed]
]; nested: ElasticsearchException[Elasticsearch exception [type=illegal_argument_exception, reason=Fielddata is disabled on text fields by default. Set fielddata=true on [name] in order to load fielddata in memory by uninverting the inverted index. Note that this can however use significant memory. Alternatively use a keyword field instead.]];
After looking around for over a day, I've realized that the mapping gets changed from the original mapping (specified at the beginning), and it gets automatically created as this:
"test": {
"aliases": {},
"mappings": {
"properties": {
"age": {
"type": "long"
},
"name": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
},
"timeStamp": {
"type": "long"
},
"uniqueKey": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
}
}
}
As you can see, the mapping changes automatically, which causes the error.
Thanks for any help!
Elasticsearch creates a dynamic mapping only when no mapping exists for a field at the time documents are inserted. Check that the put-mapping call happens before documents are added to the index. If the mappings are applied statically, be sure the documents are inserted into the right index.
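A minimal sketch of creating the index with the explicit mapping before any insert, using the REST High Level Client (the index name and mapping JSON are taken from the question; `ensureIndex` is a hypothetical helper, and error handling is omitted):

```java
import java.io.IOException;
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RestHighLevelClient;
import org.elasticsearch.client.indices.CreateIndexRequest;
import org.elasticsearch.client.indices.GetIndexRequest;
import org.elasticsearch.common.xcontent.XContentType;

public class IndexSetup {

    private static final String MAPPING = """
        {
          "properties": {
            "date":      { "type": "long" },
            "name":      { "type": "text" },
            "age":       { "type": "integer" },
            "uniqueKey": { "type": "keyword" }
          }
        }""";

    // Create "test" with the explicit mapping if it does not exist yet,
    // so dynamic mapping never kicks in on the first insert.
    static void ensureIndex(RestHighLevelClient client) throws IOException {
        boolean exists = client.indices()
            .exists(new GetIndexRequest("test"), RequestOptions.DEFAULT);
        if (!exists) {
            CreateIndexRequest request = new CreateIndexRequest("test")
                .mapping(MAPPING, XContentType.JSON);
            client.indices().create(request, RequestOptions.DEFAULT);
        }
    }
}
```

Calling something like this in the "before each test case" step would guarantee the static mapping wins over dynamic mapping.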
I'm having an issue implementing a Filter on a Projection that I have working in the Mongo Shell. I've got a Census object that contains a list of Employees.
{
  "_id": "ID",
  "name": "census1",
  "employees": [
    { "eeId": "EE_ID1" },
    { "eeId": "EE_ID2" },
    { "eeId": "EE_ID3" }
  ]
}
Realistically this could contain a lot of employees, so I'd like to be able to retrieve the main Census object with only a subset of employees. I've already implemented 'slice'; this is going to retrieve a set of employees by their eeId.
This works fine:
db.census.aggregate(
[
{
$match: {
"_id": ObjectId("ID1")
}
},
{
$project: {
"censusName": 1,
"employees" : {
$filter : {
input: "$employees",
as: "employees",
cond: { $in: [ "$$employees.eeId", ["EE_ID1", "EE_ID3"]] }
}
}
}
}
]
).toArray()
The problem is, I can't get it implemented in Java. Here 'employeeIds' holds the IDs I want.
MatchOperation matchCensusIdStage = Aggregation.match(new Criteria("id").is(censusId));
ProjectionOperation projectStage = Aggregation.project("censusName")
.and(Filter.filter("employees")
.as("employees")
.by(In.arrayOf(employeeIds).containsValue("employees.eeId")))
.as("employees");
Aggregation aggregation = Aggregation.newAggregation(matchCensusIdStage, projectStage);
return mongoTemplate.aggregate(aggregation, Census.class, Census.class).getMappedResults().get(0);
For this, no results are returned. I've also tried implementing it with a BasicDBObject but got stuck there too.
EDIT (workaround):
I did get a solution using aggregation but not with the filter on the project. This is what I did:
db.parCensus.aggregate(
// Pipeline
[
{
$match: {
"_id": ObjectId("ID1")
}
},
{
$project: {
"_id": 0, "employee": "$employees"
}
},
{
$unwind: "$employee"
},
{
$match: {
"employee.eeId": { $in: ["EE_ID1", "EE_ID3"] }
}
}
]
).toArray()
Java Code:
MatchOperation matchCensusIdStage = Aggregation.match(new Criteria("id").is(censusId));
ProjectionOperation projectStage = Aggregation.project("censusName").and("employees").as("employee");
UnwindOperation unwindStage = Aggregation.unwind("employee");
MatchOperation matchEmployeeIdsStage = Aggregation.match(new Criteria("employee.eeId").in(employeeIds));
Aggregation aggregation = Aggregation.newAggregation(matchCensusIdStage, projectStage, unwindStage, matchEmployeeIdsStage);
I know I could add a $group at the end to put it back into one Census object, but I just created a separate CensusEmployee object to store it all.
The aggregation query posted in the question works fine. The Spring Data MongoDB API syntax for the aggregation's ArrayOperators.In is not clear, and I couldn't implement a solution based on that aggregation (nor find any related answers on the net).
But the alternative solution is based on the following aggregation query, and it works fine.
db.collection.aggregate( [
{ $unwind: "$employees" },
{ $match: { "employees.eeId": { $in: ["EE_ID1", "EE_ID3"] } } },
{ $group: { _id: "$_id", name: { $first: "$name" }, employees: { $push: "$employees" } } }
] )
The Java code:
List<String> empsToMatch = Arrays.asList("EE_ID1", "EE_ID3");
MongoOperations mongoOps = new MongoTemplate(MongoClients.create(), "test");
Aggregation agg = newAggregation(
unwind("employees"),
match(Criteria.where("employees.eeId").in(empsToMatch )),
group("_id")
.first("name").as("name")
.push("employees").as("employees")
);
AggregationResults<Document> results = mongoOps.aggregate(agg, "collection", Document.class);
I have a REST controller that returns a list of products like so:
Current output
[
{
"id":1,
"name":"Money market"
},
{
"id":2,
"name":"Certificate of Deposit"
},
{
"id":3,
"name":"Personal Savings"
}
]
In order to get things working with our JS grid library, I need to modify the response to look like:
Desired output
{ "data" :
[
{
"id":1,
"name":"Money market"
},
{
"id":2,
"name":"Certificate of Deposit"
},
{
"id":3,
"name":"Personal Savings"
}
]
}
Controller
#RequestMapping(value = "/api/products", method = RequestMethod.GET)
public ResponseEntity<?> getAllProducts() {
List<Product> result = productService.findAll();
return ResponseEntity.ok(result);
}
Is there an easy way to modify the JSON response using native Spring libraries?
You can put the result object into a Map with the key "data" and the result as its value.
map.put("data", result);
Then return the map object from the rest method.
return ResponseEntity.ok(map);
Using org.json library:
JSONObject json = new JSONObject();
json.put("data", result);
The put methods add or replace values in an object.
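Putting the Map approach together, a minimal sketch might look like this (`Product`, `productService`, and the endpoint are from the question; the `wrap` helper name is my own):

```java
import java.util.List;
import java.util.Map;

public class ResponseWrapper {

    // Wrap any result list under a "data" key, as the JS grid expects.
    public static Map<String, Object> wrap(List<?> result) {
        return Map.of("data", result);
    }

    /*
     * In the controller from the question this would be used as:
     *
     * @RequestMapping(value = "/api/products", method = RequestMethod.GET)
     * public ResponseEntity<?> getAllProducts() {
     *     return ResponseEntity.ok(wrap(productService.findAll()));
     * }
     */
}
```

Spring's Jackson converter serializes the Map as the desired `{"data": [...]}` envelope, so no custom serializer is needed.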
I am having a problem mapping both a single object and an array of those objects with com.fasterxml.jackson.annotation.
Please see the example below; keep in mind that this is a response payload and it is not under my control:
{
"GetSomeUserInfoDetails": {
"ItemListOfUser": {
"itemList": {
"item": [
{
"name": "Stack overflow",
"adress": "ola"
},
{
"name": "Google",
"adress": "man"
}
]
}
}
}
}
The json-to-pojo generator creates the classes that I can use for this response. The problem occurs when there is only one item in the itemList; then I get the following response:
{
"GetSomeUserInfoDetails": {
"WorkItem": {
"itemList": {
"item": {
"name": "Stack overflow",
"adress": "ola"
}
}
}
}
}
When you generate the classes now, you will see a different class structure. Is there a way to solve this with Jackson?
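One common approach is Jackson's ACCEPT_SINGLE_VALUE_AS_ARRAY deserialization feature, which lets a single JSON object bind into a List field. A sketch (the `Item`/`ItemList` classes here are minimal stand-ins for the generated ones, covering only the innermost level of the payload):

```java
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.List;

public class SingleOrArrayDemo {
    // Minimal stand-in for the generated item class
    public static class Item {
        public String name;
        public String adress; // spelling matches the payload
    }

    public static class ItemList {
        public List<Item> item; // bound from either a single object or an array
    }

    public static ItemList parse(String json) throws Exception {
        ObjectMapper mapper = new ObjectMapper()
            // Treat a lone JSON object as a one-element array:
            .enable(DeserializationFeature.ACCEPT_SINGLE_VALUE_AS_ARRAY);
        return mapper.readValue(json, ItemList.class);
    }
}
```

Alternatively, the feature can be enabled per field with `@JsonFormat(with = JsonFormat.Feature.ACCEPT_SINGLE_VALUE_AS_ARRAY)`, which avoids changing the shared ObjectMapper.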