Get custom object by id from JSON with Jayway JsonPath (Java)

I'm trying to get a User object from JSON with Jayway JsonPath.
My class is:
public class User {
    private String id;
    private String name;
    private String password;
    private String email;
    /* getters, setters, constructor */
}
And a JSON example:
{
    "user": [
        {
            "id": "1",
            "login": "client1",
            "password": "qwerty",
            "email": "client#gmail.com"
        }
    ]
}
I want to get something like this:
public Optional<User> find(String id) throws NoSuchEntityException {
    Optional<User> user = Optional.empty();
    try {
        Path path = Path.of(fileDestination + fileName);
        ReadContext ctx = JsonPath.parse(Files.readString(path));
        User readUser = ctx.read("$..user[*]", User.class, Filter.filter(where("id").is(id)));
        user = Optional.ofNullable(readUser);
    } catch (IOException e) {
        e.printStackTrace();
    }
    return user;
}
Or to get some good advice on how to code it :D

Two things here:
You must use the placeholder ? instead of * when you're filtering; otherwise the filter will be ignored.
read() will return a list, not a single object.
So I think you need something like this:
String json = "{\r\n"
        + "    \"user\": [\r\n"
        + "        {\r\n"
        + "            \"id\": \"1\",\r\n"
        + "            \"login\": \"client1\",\r\n"
        + "            \"password\": \"qwerty\",\r\n"
        + "            \"email\": \"client#gmail.com\"\r\n"
        + "        }\r\n"
        + "    ]\r\n"
        + "}";

Predicate filterById = filter(where("id").is("1"));
List<User> users = JsonPath.parse(json).read("$.user[?]", filterById);

System.out.println(users);
Reference: Filter Predicates
where("category").is("fiction").and("price").lte(10D) );
List<Map<String, Object>> books =
parse(json).read("$.store.book[?]", cheapFictionFilter);
Notice the placeholder ? for the filter in the path. When multiple filters are provided they are applied in order, and the number of placeholders must match the number of provided filters. You can specify multiple predicate placeholders in one filter operation, e.g. [?, ?]; in that case both predicates must match.
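As a side note for the original find(String id) method: combining the two points above, a minimal sketch could read the filtered matches as maps and convert the first one with Jackson. This assumes the User fields line up with the JSON keys (the example JSON has "login" while the class has "name", so a rename or @JsonProperty would be needed); the class wrapper and the file values below are illustrative only, not from the original post.

import com.fasterxml.jackson.databind.ObjectMapper;
import com.jayway.jsonpath.JsonPath;

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.Map;
import java.util.Optional;

import static com.jayway.jsonpath.Criteria.where;
import static com.jayway.jsonpath.Filter.filter;

public class UserFinder {

    private final String fileDestination = "/tmp/";   // illustrative values
    private final String fileName = "users.json";

    public Optional<User> find(String id) {
        try {
            Path path = Path.of(fileDestination + fileName);
            // "?" is the filter placeholder; "*" would ignore the predicate entirely.
            List<Map<String, Object>> matches = JsonPath.parse(Files.readString(path))
                    .read("$.user[?]", filter(where("id").is(id)));
            // read() returns a list, so take the first match (if any) and map it onto User.
            return matches.stream()
                    .findFirst()
                    .map(m -> new ObjectMapper().convertValue(m, User.class));
        } catch (IOException e) {
            e.printStackTrace();
            return Optional.empty();
        }
    }
}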

Related

Aggregation Annotation Not Working: It Is Returning an Empty List

I am trying to run the Aggregation below using Spring (Java), but it returns an empty list on my console. This is working in MongoDB, but not in my Spring application.
This is my MongoDB query:
db.collection.aggregate([
    {
        $match: {
            $and: [
                { user_id: "256f5280-fb49-4ad6-b7f5-65c4329d46e0" },
                { time: { $gte: 1622471890, $lt: 1822471890 } }
            ]
        }
    },
    {
        $sort: { time: -1 }
    }
])
It returns 2 results:
[
    {
        "_id": ObjectId("5a934e000102030405000001"),
        "message": "This is an example of my db (1)",
        "time": 1.62247189e+09,
        "user_id": "256f5280-fb49-4ad6-b7f5-65c4329d46e0"
    },
    {
        "_id": ObjectId("5a934e000102030405000002"),
        "message": "This is an example of my db (2)",
        "time": 1.62247189e+09,
        "user_id": "256f5280-fb49-4ad6-b7f5-65c4329d46e0"
    }
]
My problem is when I do this using MongoDB with Spring (Java):
My Repository
@Aggregation(pipeline = {
        "{" +
            "$match: {" +
                "$and: [" +
                    "{" +
                        "user_id: ?2" +
                    "}," +
                    "{" +
                        "time: {" +
                            "$gte: ?0," +
                            "$lt: ?1" +
                        "}" +
                    "}" +
                "]" +
            "}" +
        "}",
        "{" +
            "$sort: {" +
                "time: -1" +
            "}" +
        "}"
})
List<MessagesUser> findByPeriodAndUserIdPaginated(long from, long to, String user_id, Pageable pageable);
A part of My Service
@Override
public Page<MessagesUser> findAllBetween(Date from, Date to, Pageable page, String user_id) {
    List<MessagesUser> messagesUserList = messagesUserRepository.findByPeriodAndUserIdPaginated(
            from.getTime() / 1000, to.getTime() / 1000, user_id, page);
    System.out.println("messagesUserList: ");
    System.out.println(messagesUserList); // It is the empty list
This Java repository query returns an empty array instead of my 2 values. You can see the dataset and an example working correctly as a MongoDB query here: Mongo playground
There is a problem with your annotation. You have used
"user_id: ?2" +
but user_id is the 4th param, so you need to use
"user_id: ?3" +
After Gibbs' answer I had an idea (to check whether the query was executing with the correct values) and I found the problem, but it was in my @Document collection name.
First I added query logs to my console application to see if the query was correct, and it was.
To check the logs I added this line to my application.properties:
logging.level.org.springframework.data.mongodb.core.MongoTemplate=DEBUG
The log showed that the query itself was correct, but the collection name it was executed against was not: it ran against messagesUser instead of messagesuser. So I changed the collection name in my entity MessagesUser and it worked.
After (solved and working):
@Document(collection = "messagesuser")
public class MessagesUser {

    public MessagesUser(String user_id, Long time) {
        this.user_id = user_id;
        this.time = time;
    }
    // [continuation of the entity code...]
Before (it wasn't working):
@Document
public class MessagesUser {

    public MessagesUser(String user_id, Long time) {
        this.user_id = user_id;
        this.time = time;
    }
    // [continuation of the entity code...]
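As a recap, here is a minimal sketch of how the pieces fit together, keeping the OP's original ?2 binding (Spring Data placeholders are zero-indexed: ?0 = from, ?1 = to, ?2 = user_id) and the explicit collection name; the class and field names come from the post, everything else is illustrative.

import org.springframework.data.annotation.Id;
import org.springframework.data.domain.Pageable;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.repository.Aggregation;
import org.springframework.data.mongodb.repository.MongoRepository;

import java.util.List;

// The default collection name derived from the class would be "messagesUser",
// so the explicit name must match the real collection ("messagesuser").
@Document(collection = "messagesuser")
class MessagesUser {
    @Id
    private String id;
    private String message;
    private Long time;
    private String user_id;
    // getters/setters omitted
}

interface MessagesUserRepository extends MongoRepository<MessagesUser, String> {

    // Placeholders are zero-indexed positional bindings: ?0 = from, ?1 = to, ?2 = user_id.
    @Aggregation(pipeline = {
            "{ $match: { $and: [ { user_id: ?2 }, { time: { $gte: ?0, $lt: ?1 } } ] } }",
            "{ $sort: { time: -1 } }"
    })
    List<MessagesUser> findByPeriodAndUserIdPaginated(long from, long to, String user_id, Pageable pageable);
}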

Return substring of analyzed, non-stored text field in elasticsearch java api

I work on a project that has a string field (the name is urlOrContent), and it can be short (fewer than 50 characters) or very long (more than 50 characters). I just want to return the first 50 characters every time, based on a specific query. My database is Elasticsearch, and my problem is raised in this link, where the questioner's response seems to be correct (the urlOrContent field is an analyzed, non-stored text field). It uses the following script:
{
    "script_fields": {
        "substring": {
            "script": {
                "lang": "painless",
                "inline": "params._source.text.substring(0, 100)"
            }
        }
    }
}
But my main problem is that I cannot find the equivalent Elasticsearch Java API code. In fact, what should be added to the code below so that it returns only the first 50 characters of the urlOrContent field? Note that this field may not even have 50 characters in some cases, and then the entire string should be returned.
String queryString =
        EnumLinkFields.CREATE_TIME.getFieldName() + ":(>=" + dateFrom + " AND <=" + dateTo + ")";
QueryBuilder query = QueryBuilders.queryStringQuery(queryString);

SearchResponse response = TRANSPORT_CLIENT.prepareSearch(MY_INDEX)
        .setTypes(MY_TYPE)
        .setSearchType(SEARCH_TYPE)
        .setQuery(query)
        .setFetchSource(null, new String[]{EnumLinkFields.USER_ID.getFieldName()})
        .setFrom(offset)
        .setSize(count)
        .addSort(orderByField, sortOrder)
        .execute().actionGet();
I found the best answer.
String queryString =
        EnumLinkFields.CREATE_TIME.getFieldName() + ":(>=" + dateFrom + " AND <=" + dateTo + ")";
QueryBuilder query = QueryBuilders.queryStringQuery(queryString);

String codeUrlOrContent = "if (" + EnumElasticScriptField.URL_OR_CONTENT.getFieldName() + ".length() > 50) {" +
        "return " + EnumElasticScriptField.URL_OR_CONTENT.getFieldName() + ".substring(0, 50);" +
        "} else { " +
        "return " + EnumElasticScriptField.URL_OR_CONTENT.getFieldName() + "; }";

Script scriptUrlOrContent = new Script(ScriptType.INLINE, "painless",
        codeUrlOrContent, Collections.emptyMap());
Script scriptIsUrl = new Script(ScriptType.INLINE, "painless",
        EnumElasticScriptField.IS_URL.getFieldName(), Collections.emptyMap());

SearchResponse response = TRANSPORT_CLIENT.prepareSearch(MY_INDEX)
        .setTypes(MY_TYPE)
        .setSearchType(SEARCH_TYPE)
        .setQuery(query)
        .addScriptField(EnumLinkFields.URL_OR_CONTENT.getFieldName(), scriptUrlOrContent)
        .addScriptField(EnumLinkFields.IS_URL.getFieldName(), scriptIsUrl)
        .setFrom(offset)
        .setSize(count)
        .addSort(orderByField, sortOrder)
        .execute().actionGet();
Note that the call to setFetchSource must be removed, and all returned fields must be returned through script fields.
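For completeness, a small sketch of how the script-field values could then be read off the hits (assuming the same 5.x/6.x-era TransportClient API as above, using org.elasticsearch.search.SearchHit; accessor names may differ slightly between versions):

// Script fields come back under "fields", not "_source".
for (SearchHit hit : response.getHits().getHits()) {
    Object truncatedUrlOrContent = hit.getFields()
            .get(EnumLinkFields.URL_OR_CONTENT.getFieldName())
            .getValue();
    System.out.println(truncatedUrlOrContent);
}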
You can put your script_fields query in the query object, i.e. in setQuery(query).
Your query object probably looks like this right now:
"query": {
    "term": { "user": "kimchy" }
}
After you add script_fields to the object, it should become:
"query": {
    "term": { "user": "kimchy" }
},
"script_fields": {
    "urlOrContent": {
        "script": {
            "lang": "painless",
            "inline": "if (params._source.urlOrContent.length() > 50) { params._source.urlOrContent.substring(0, 50) } else { params._source.urlOrContent }"
        }
    }
}
The resulting hits will have a fields array with the substring you required.
You have to enable scripting by changing the elasticsearch.yml file like so and restarting Elasticsearch:
script.engine.painless.inline.aggs: on
script.engine.painless.inline.update: on
script.inline: on
script.indexed: on

Date-value is two days behind inserted value after sending via REST and saving with JPA

I have an Angular frontend with a Java backend, and I'd like to use PrimeNG's DateInput to receive and save a date. I have the following code for that:
<p-calendar [(ngModel)]="enddate" dateFormat="dd.mm.yy" class="medium-field"></p-calendar>
and in my component:
enddate: Date;
When sending this value via REST, I have the following code (including check):
createPSP(project_num: string, financialItem: FinancialItem) {
    this.logger.info("Startdate is " + financialItem.startDate);
    this.logger.info("Enddate is " + financialItem.endDate);
    let order_num = financialItem.orderNumber;
    let psp_num = financialItem.pspNumber;
    return this.http.post(`${httpBaseUrl}/project/${project_num}/order/${order_num}/addFinancialItem/${psp_num}`, financialItem).pipe();
}
Output (which is right):
Then I save it in my backend in a LocalDate variable (which maps to a date column in MySQL). Now what happens is that I insert 31.12.2018, I get 30.12.2018 in my converter in the backend, and MySQL stores 29.12.2018. Why is that the case?
Edit: When I change LocalDate to LocalDateTime, I get 30.12.2018 in MySQL instead of 29.12.2018, which is obviously still wrong.
Some more code:
I defined my MySQL column like this (in my entity):
@Entity
class FinancialItemDto {
    //...
    @Column(name = "ENDDATE", nullable = false)
    private LocalDate endDate;
}
In the Controller:
public ResponseEntity addFinancialItem(@PathVariable String project_num, @PathVariable String order_num,
        @PathVariable String psp_num, @RequestBody FinancialItemDto financialItemDto) {
    try {
        this.financialItemService.saveItemGivenProjectAndOrderNumber(financialItemDto, order_num);
    } catch (NoSuchEntityException e) {
        this.logger.error(e.getMessage());
        return ResponseEntity.status(HttpStatus.CONFLICT).body(e.getUserMessage());
    }
    return ResponseEntity.ok(HttpStatus.OK);
}
In the Service:
@Transactional
@Override
public void saveItemGivenProjectAndOrderNumber(FinancialItemDto financialItemDto, String orderNumber)
        throws NoSuchEntityException {
    OrderEntity order = this.orderRepository.findByOrderNumber(orderNumber).orElseThrow(
            () -> new NoSuchEntityException("Order with number " + orderNumber + " could not be found.",
                    "Der Abruf wurde nicht gefunden."));
    OrdertypesEntity ordertype = this.ordertypesRepository.findByShorthand(financialItemDto.getOrderType())
            .orElseThrow(() -> new NoSuchEntityException("Ordertype " + financialItemDto.getOrderType()
                    + " for creating FI to order " + orderNumber + " could not be found.",
                    "Der Abruftyp wurde nicht gefunden."));
    FinancialItemEntity financialItemEntity = FinancialItemConverter.dtoToEntity(financialItemDto, ordertype, order);
    this.entityManager.persist(financialItemEntity);
}
The DTO on the TS side defines the date as follows:
export class FinancialItem {
    endDate: Date;
    //...
}
My converter just passes it on:
public static FinancialItemEntity dtoToEntity(FinancialItemDto financialItemDto, OrdertypesEntity ordertype, OrderEntity order) {
    FinancialItemEntity financialItemEntity = new FinancialItemEntity( (...), financialItemDto.getEndDate(), (...));
    LoggerFactory.getLogger(FinancialItemConverter.class).info("Got Date Value: " + financialItemDto.getEndDate()); // gives: Got Date Value: 2018-12-30 instead of 31.12.2018
    return financialItemEntity;
}
Update:
A workaround for the REST service "losing" one day was to save the date in long format, pass that on, and convert it back. Sadly, when calling the repository.save() function, the logger shows that when I insert 2018-12-01 the date value is also 2018-12-01, but MySQL says it is 2018-11-30 in my database. I cannot make out what is happening there:
this.logger.info("Startdate is " + financialItemDto.getStartDate());    // 2018-12-01
this.logger.info("Startdate is " + financialItemEntity.getStartDate()); // 2018-12-01
this.financialItemRepository.save(financialItemEntity);
this.logger.info("Startdate (received) is " + this.financialItemRepository.getFinancialItemByPSPNumber(financialItemDto.getPspNumber()).get().getStartDate()); // 2018-12-01

ArangoDB Java Driver: put multiple parameters in transaction

I need to pass multiple parameters to a transaction in the Java driver for ArangoDB.
It works with a single parameter:
public String save(User user) throws ArangoDBException {
    TransactionOptions options = new TransactionOptions().params(user).writeCollections(collectionName);
    String action = "function (params) { "
            + "var db = require('internal').db; "
            + "var doc = params;"
            + "db.users.save(doc);"
            + "}";
    return db.transaction(action, String.class, options);
}
But if I need to pass multiple parameters, I'm stuck. I tried passing a map, an ArrayList, or an array, but it doesn't seem to work:
public void save(User user, User user2) throws ArangoDBException {
    Map<String, Object> parameters = new MapBuilder()
            .put("user", user)
            .put("user2", user2)
            .get();
    TransactionOptions options = new TransactionOptions().params(parameters).writeCollections(collectionName);
    String action = "function (params) { "
            + "var db = require('internal').db; "
            + "var doc = params['user'];"
            + "var doc2 = params['user2'];"
            + "db.users.save(doc);"
            + "db.users.save(doc2);"
            + "}";
    db.transaction(action, String.class, options);
}
Your workaround is no longer necessary. The missing automatic serialization of map/list/array within TransactionOptions was a bug in the java-driver, which is fixed in version 4.1.5.
Had to serialize the map:
TransactionOptions().params(db.util().serialize(params)).writeCollections(collectionName, "users2");
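For context, a sketch of what that pre-4.1.5 workaround looks like inside the original two-user save method; collectionName, db, and the users collection come from the post, the rest is illustrative:

public void save(User user, User user2) throws ArangoDBException {
    Map<String, Object> parameters = new MapBuilder()
            .put("user", user)
            .put("user2", user2)
            .get();

    // Pre-4.1.5 workaround: serialize the map yourself before handing it to the transaction.
    // From 4.1.5 on, .params(parameters) works directly.
    TransactionOptions options = new TransactionOptions()
            .params(db.util().serialize(parameters))
            .writeCollections(collectionName);

    String action = "function (params) { "
            + "var db = require('internal').db; "
            + "db.users.save(params['user']); "
            + "db.users.save(params['user2']); "
            + "}";

    db.transaction(action, String.class, options);
}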

Using JsonAnySetter and JsonAnyGetter with ArrayList within Hashmap

So I'm trying to get my head around using Jackson annotations, and I'm making requests against Riot's API. This is the response I'm getting: http://jsonblob.com/568079c8e4b01190df45d254, where the array after the summonerId (38584682) can be of varying length.
The unique summoner ID will be different every single time, too.
I want to map this response to a DTO.
For a similar situation with a different call I am doing:
@JsonIgnore
protected Map<String, SingleSummonerBasicDTO> nonMappedAttributes;

@JsonAnyGetter
public Map<String, SingleSummonerBasicDTO> getNonMappedAttributes() {
    return nonMappedAttributes;
}

@JsonAnySetter
public void setNonMappedAttributes(String key, SingleSummonerBasicDTO value) {
    if (nonMappedAttributes == null) {
        nonMappedAttributes = new HashMap<String, SingleSummonerBasicDTO>();
    }
    if (key != null) {
        if (value != null) {
            nonMappedAttributes.put(key, value);
        } else {
            nonMappedAttributes.remove(key);
        }
    }
}
From an answer on here, my thinking is to do a for-each loop over each of the elements in the array, but I don't know how to loop over something without having something to loop over.
I am completely stuck as to how the annotations work and how to proceed; any help is appreciated!
First of all, @JsonAnySetter is meant to deal with the case of varying properties, not with JSON arrays of varying length.
Jackson is quite capable of using Java Collections and Maps in serialization and deserialization. You just have to tell it the parameter type of the collection.
In your case, I have used a Map to capture the root element, making it the sole key with a List of DTOs as the value. I use Jackson's type system (TypeFactory and JavaType) to tell Jackson about all the generic types.
This is the DTO that I have used:
public class SingleSummonerBasicDTO {

    public String name;
    public String tier;
    public String queue;
    public List<SingleSummonerBasicDTOEntry> entries;

    @Override
    public String toString() {
        String toString = "\nSingleSummonerBasicDTO: " + name + " " + tier + " " + queue;
        for (SingleSummonerBasicDTOEntry entry : entries) {
            toString += "\n" + entry.toString();
        }
        return toString;
    }

    public static class SingleSummonerBasicDTOEntry {

        public String playerOrTeamId;
        public String playerOrTeamName;
        public String division;
        public int leaguePoints;
        public int wins;
        public int losses;
        public boolean isHotStreak;
        public boolean isVeteran;
        public boolean isFreshBlood;
        public boolean isInactive;

        @Override
        public String toString() {
            return "Entry: " + playerOrTeamId + " " + playerOrTeamName + " " + division + " " + leaguePoints + " " + wins + " "
                    + losses + " " + isHotStreak + " " + isVeteran + " " + isInactive;
        }
    }
}
This is how to deserialise:
public static void main(String[] args) {
    ObjectMapper mapper = new ObjectMapper();
    TypeFactory factory = mapper.getTypeFactory();

    // type of the key of the response map
    JavaType stringType = factory.constructType(String.class);
    // type of the value of the response map
    JavaType listOfDtosType = factory.constructCollectionLikeType(ArrayList.class, SingleSummonerBasicDTO.class);
    // type of the response map itself
    JavaType responseType = factory.constructMapLikeType(HashMap.class, stringType, listOfDtosType);

    try (InputStream is = new FileInputStream("C://Temp/xx.json")) {
        Map<String, List<SingleSummonerBasicDTO>> response = new ObjectMapper().readValue(is, responseType);
        System.out.println(response);
    } catch (IOException e) {
        e.printStackTrace();
    }
}
output:
{38584682=[
SingleSummonerBasicDTO: Viktor's Masterminds PLATINUM RANKED_SOLO_5x5
Entry: 38584682 Lazrkiller V 64 291 295 false true false,
SingleSummonerBasicDTO: Renekton's Horde SILVER RANKED_TEAM_5x5
Entry: TEAM-ff7d0db0-78ca-11e4-b402-c81f66dba0e7 Y U NO BABAR II 0 4 2 false false false,
SingleSummonerBasicDTO: Pantheon's Chosen SILVER RANKED_TEAM_5x5
Entry: TEAM-d32018f0-d998-11e4-bfd2-c81f66dba0e7 Lose and Throw Away I 66 7 0 false false false,
SingleSummonerBasicDTO: Jayce's Duelists SILVER RANKED_TEAM_5x5
Entry: TEAM-6c8fc440-a8ac-11e4-b65b-c81f66db920c TopBlokesNeverToke III 0 20 18 false false false]}
Here is how to parse your JSON:
Map<String, List<SingleSummonerBasicDTO>> summonersMap = new ObjectMapper()
        .readValue(json, new TypeReference<HashMap<String, List<SingleSummonerBasicDTO>>>() {});
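Since the summoner id key varies per response, a short follow-up sketch (variable names are illustrative) of pulling the DTO lists out without knowing the key in advance:

// The response map has a single entry whose key is the (varying) summoner id.
Map<String, List<SingleSummonerBasicDTO>> summonersMap = new ObjectMapper()
        .readValue(json, new TypeReference<HashMap<String, List<SingleSummonerBasicDTO>>>() {});

summonersMap.forEach((summonerId, leagues) -> {
    System.out.println("summonerId = " + summonerId);
    leagues.forEach(System.out::println);
});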
