Google Cloud Datastore runQuery: get result as a date variable? - java

I'm using Java. This is my code:
RunQueryResponse response = dataset.runQuery("project_name", queryrequest).execute();
and response.toString() gives me all the query results I want, but there are many of them.
How do I get a single value for each field? For example, putting the results in an array or something I can go through with a for loop or an iterator.
Thanks
---- added code ----
Here is some of my code:
Iterator<EntityResult> entityIterator = response.getBatch().getEntityResults().iterator();
Map<String, Property> entity;
while (entityIterator.hasNext()) {
    entity = entityIterator.next().getEntity().getProperties();
    String first = entity.get("First").toString();
    String last = entity.get("Last").toString();
    String time = entity.get("Time").toString();
    System.out.println(first);
    System.out.println(last);
    System.out.println(time);
}
and the printed output:
{"values":[{"stringValue":"first name"}]}
{"values":[{"stringValue":"last name"}]}
{"values":[{"dateTimeValue":"2013-08-28T08:21:58.498Z"}]}
How can I get the time as a date-time variable, and the first and last names as plain strings, without the {"values":[{"stringValue":" wrapper and all the other junk?

Each Property may contain one or more Value objects, and each Value can contain one of several different value types (each type has its own field). If your properties are single-valued, you can just take the first one from the list:
Iterator<EntityResult> entityIterator = response.getBatch().getEntityResults().iterator();
Map<String, Property> entity;
while (entityIterator.hasNext()) {
    entity = entityIterator.next().getEntity().getProperties();
    String first = entity.get("First").getValues().get(0).getStringValue();
    String last = entity.get("Last").getValues().get(0).getStringValue();
    DateTime dateTime = entity.get("Time").getValues().get(0).getDateTimeValue();
    Date time = new Date(dateTime.getValue());
    System.out.println(first);
    System.out.println(last);
    System.out.println(time);
}
Note that this is an example of using the JSON API (full JavaDoc reference). The samples in the Google Cloud Datastore docs are using the Protocol Buffers API which is structured in the same way but has slightly different syntax.
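If you want to collect the results into a structure you can loop over later (as the question asks), you can copy each entity into a simple holder object first. Here is a minimal sketch using the same JSON API calls as above; the Person holder class is hypothetical, introduced only for illustration:

import java.util.ArrayList;
import java.util.Date;
import java.util.List;
import java.util.Map;

// Hypothetical holder for one query result.
class Person {
    String first;
    String last;
    Date time;
}

List<Person> people = new ArrayList<>();
for (EntityResult result : response.getBatch().getEntityResults()) {
    Map<String, Property> props = result.getEntity().getProperties();
    Person person = new Person();
    person.first = props.get("First").getValues().get(0).getStringValue();
    person.last = props.get("Last").getValues().get(0).getStringValue();
    // DateTime.getValue() returns milliseconds since the epoch.
    person.time = new Date(props.get("Time").getValues().get(0).getDateTimeValue().getValue());
    people.add(person);
}

// Now the results can be iterated like any other list.
for (Person p : people) {
    System.out.println(p.first + " " + p.last + " " + p.time);
}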

Related

Play Framework #ValidateWithPayload

I'm trying to pass a payload to validate(ValidationPayload) in the Play Framework using Java. I cannot access the values stored in payload.getAttrs(), which returns a TypedMap.
I tried to access the cookies by calling payload.getAttrs().getOptional(TypedKey.create("Cookies")) in the validate method, but I always get null back.
When I evaluate the expression in IntelliJ, I can see that the attrs contain Cookies, Flash and more, but I cannot access these values. (I can see the values in the Expression Evaluator screenshot.)
public String validate(Constraints.ValidationPayload payload) {
    TypedMap attrs = payload.getAttrs();
    Optional<Object> baseDomain = payload.getAttrs().getOptional(TypedKey.create("baseDomain"));
    Locale value = payload.getAttrs().get(TypedKey.create("selectedLang"));
    return "String";
}
How can I access these objects stored in a TypedMap?
I figured this out: a TypedMap uses TypedKeys, and a TypedKey is unique to each INSTANCE of the key.
That means you need to fetch from the TypedMap with the same instance of the key that was used to put the value into the map. Creating a new key will give an empty or null response.
This should work:
TypedKey<String> baseDomainKey = TypedKey.create("baseDomain");
payload.getAttrs().addAttrs(baseDomainKey, "domain");
String domain = payload.getAttrs().get(baseDomainKey); // same key instance used for put and get
This will not work however:
TypedKey<String> baseDomainKey = TypedKey.create("baseDomain");
payload.getAttrs().addAttrs(baseDomainKey, "domain");
String domain = payload.getAttrs().get(TypedKey.create("baseDomain")); // new key instance, lookup misses
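In practice, the cleanest fix is to declare the key once as a shared constant, so the code that stores the attribute and the code that reads it back are guaranteed to use the same instance. A minimal sketch, assuming Play's Java typedmap API (the Attrs holder class name is hypothetical):

import play.libs.typedmap.TypedKey;

public class Attrs {
    // Single shared key instance, used by both the writer and the reader.
    public static final TypedKey<String> BASE_DOMAIN = TypedKey.create("baseDomain");
}

Then inside validate() the lookup becomes:

Optional<String> baseDomain = payload.getAttrs().getOptional(Attrs.BASE_DOMAIN);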

Get column value from Ebean model row/record using variable in Play Framework

I am trying to return a value from a row returned from an Ebean model in the Play Framework.
I retrieve the row/record with this:
public static IntakeEdit findByEditKey(String editKey) {
    return find.where().eq("editkey", editKey).findUnique();
}
I am cycling through an array of the column names and want to grab each column's value. I would like to use the name as a variable and get the value that way.
Something along the lines of:
IntakeEdit intakeEdit = IntakeEdit.findByEditKey(editkey);
String fieldValue = "";
String columnToGet = "projectid";
fieldValue = intakeEdit.getColumnValue(columnToGet);
Can this be done? Or is there another method/function I could use?
I appreciate the help!
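Ebean models don't expose a getColumnValue method out of the box, but since this is essentially "get a property by name", plain Java reflection against the standard getters is one way to sketch it. This assumes the model follows the usual getXxx naming convention; getColumnValue here is a hypothetical helper, not an Ebean API:

import java.lang.reflect.Method;

// Hypothetical helper: maps a column name like "projectid" to the
// conventional getter getProjectid() and returns its value as a string.
public static String getColumnValue(Object bean, String columnName) throws Exception {
    String getterName = "get" + columnName.substring(0, 1).toUpperCase() + columnName.substring(1);
    Method getter = bean.getClass().getMethod(getterName);
    Object value = getter.invoke(bean);
    return value == null ? null : value.toString();
}

Used with the example above:

fieldValue = getColumnValue(intakeEdit, columnToGet);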

Hazelcast: Predicate Performance Issue

I am facing a performance issue in Hazelcast when using a Predicate on a Hazelcast map.
So I have a model class as shown below:
public class MyAccount {
    private String account;
    private Date startDate;
    private Date endDate;
    private MyEnum accountTypeEnum;

    // Overrides equals() and hashCode() using all the attributes
    // Has getters and setters for all the attributes
}
Then I create a Hazelcast map of type IMap<MyAccount, String>, and in that map I store each MyAccount object as the key and an associated string as its value.
Point to note: I am saving these accounts in different maps (let's say local, state, national and international).
Approximately 180,000 MyAccount objects are created and saved in Hazelcast, spread across these maps depending on the account's geographical position. Apart from these, Hazelcast stores another 50,000 string objects as keys and values in different maps (excluding the maps mentioned above).
Then I have a method that uses predicates on the attributes account, startDate and endDate to filter out accounts. Let's call this method filter:
public static Predicate filter(String account, Date date) {
    EntryObject entryObject = new PredicateBuilder().getEntryObject();
    PredicateBuilder accountPredicate = entryObject.key().get(Constants.ACCOUNT).equal(account);
    PredicateBuilder startDatePredicate = entryObject.key().get(Constants.START_DATE).isNull()
            .or(entryObject.key().get(Constants.START_DATE).lessEqual(date));
    PredicateBuilder endDatePredicate = entryObject.key().get(Constants.END_DATE).isNull()
            .or(entryObject.key().get(Constants.END_DATE).greaterThan(date));
    return accountPredicate.and(startDatePredicate.and(endDatePredicate));
}
private void addIndexesToHazelcast() {
    Arrays.asList("LOCAL", "STATE", "NATIONAL", "INTERNATIONAL").forEach(location -> {
        IMap<Object, Object> map = hazelcastInstance.getMap(location);
        map.addIndex("__key." + "startDate", true);
        map.addIndex("__key." + "endDate", true);
        map.addIndex("__key." + "account", false);
    });
}
Issue: for a particular map, say local, which holds around 80,000 objects, fetching values from the map with the predicate takes around 4-7 seconds, which is unacceptable:
Predicate predicate = filter(account, date);
Collection<String> values = hazelcast.getMap(mapKey).values(predicate); // This line takes 4-7 secs
I am surprised that the cache takes 4-7 seconds to fetch the values for one single account, given that I have added indexes in the Hazelcast maps for those same attributes. This is a massive performance blow.
Could anybody please let me know why this is happening?

Design for large scale parameter validation for JPA?

I have a method that takes in a JSON object, pulls the data out and distributes it to various strings so that they can be set on an entity and persisted. My example below is quite simple, but my actual code has 20+ fields.
For example:
public Projects createProject(JsonObject jsonInst) {
    Projects projectInst = new Projects();
    String pId = jsonInst.get("proId").getAsString();
    String pName = jsonInst.get("proName").getAsString();
    String pStatus = jsonInst.get("proStatus").getAsString();
    String pCustId = jsonInst.get("proCustId").getAsString();
    String pStartDate = jsonInst.get("proStartDate").getAsString();
    ...
    // Set the entity data
    projectInst.setProjectId(pId);
    projectInst.setProjectName(pName);
    ...
Notice that if a variable doesn't have a corresponding entry in the JSON, this code will break with a NullPointerException. Obviously I need to validate each parameter before calling .getAsString().
What is the best way to do this from a readability point of view? I could create two variables for each parameter, then check and set, for example:
if (jsonInst.has("proName")) {
    String pName = jsonInst.get("proName").getAsString();
}
Or should I wait until it is set:
if (!pName.isEmpty()) {
    projectInst.setName(pName);
}
...
Which of these approaches do you think is best for preventing errors?
Is there a way to handle this kind of presence check at scale, so that I can reduce the amount of code I have to write before I use each variable?
You can create a method that takes the field name as a parameter and returns the JSON value for that field:
private String getJSONData(String field, JsonObject json) {
    String data = null;
    if (json.has(field)) {
        data = json.get(field).getAsString();
    }
    return data;
}
You can call this method for each of your fields:
String pId = getJSONData("proId", jsonInst);
This way you not only escape the NullPointerException, but also avoid code repetition.
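If you also want to skip the intermediate variables, the same idea can be taken one step further with a helper that feeds each present field straight into its setter. A minimal sketch, assuming Gson's JsonObject (matching the getAsString() calls above); setIfPresent is a hypothetical helper:

import java.util.function.Consumer;
import com.google.gson.JsonObject;

// Hypothetical helper: calls the setter only when the field exists and
// is not a JSON null, so missing fields are skipped without an exception.
private void setIfPresent(JsonObject json, String field, Consumer<String> setter) {
    if (json.has(field) && !json.get(field).isJsonNull()) {
        setter.accept(json.get(field).getAsString());
    }
}

Used in createProject():

setIfPresent(jsonInst, "proId", projectInst::setProjectId);
setIfPresent(jsonInst, "proName", projectInst::setProjectName);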

Java LinkedHashMap: what's the difference in these two?

EDIT: The entire code and database creation script can be found at http://gitorious.org/scheator . The database script is in Schema/.
I have the following Java code:
A LinkedHashMap defined in an abstract class as
LinkedHashMap<Object, Data> list;
A descendant class that initializes this list like this:
list = new LinkedHashMap<Integer, Data>();
I add items like this:
String id = rs.getString(FIELDS[0]);
String name = rs.getString(FIELDS[1]);
Data team = new Data(Integer.parseInt(id.trim()), name);
list.put(id, team);
Now when I do this:
System.err.println("delete() count: " + list.size());
System.err.println("delete() key value " + key);
Data obj;
obj = (Data)list.remove(key);
deletedList.put(key, obj);
System.err.println("delete() count: " + list.size());
Nothing is removed from the list, i.e. the first and last prints show the same size(). The key is also correct (I have verified there is an item with that id).
However, and this is my question, if I add the values like this:
Integer id = rs.getInt(FIELDS[0]);
String name = rs.getString(FIELDS[1]);
Data team = new Data(id, name);
list.put(id, team);
The code works! Shouldn't parseInt() produce a similar key to getInt()? Why does the second version work when the first doesn't? I spent a good hour debugging this before I found the cause, and I still can't figure out the reason.
First example:
String id = rs.getString(FIELDS[0]);
Second example:
Integer id = rs.getInt(FIELDS[0]);
I can't say for sure since I can't see the rest of the code, but if the key variable is an Integer in this call:
obj = (Data)list.remove(key);
then the remove will only work if the object was put into the map using an Integer key, which is why it only works when the id is an Integer when you call the put method. The String "123" does not equal the integer 123.
Also, I am assuming you just missed a line in your first example, since there was no call to list.put(id, team); that could also be the source of your problems.
There should be no difference, but there are a number of things that are not clear from your example:
- deletedList does not refer to the list object
- the records in your database that are being used are the same in both cases (perhaps in the first a different int is being used that is already in the Map)
Autoboxing may also be complicating the issue. Replace Integer id in the second sample with int id to pass the same arguments to your Data constructor.
Maybe you could post up the complete code such that we can reproduce the scenario accurately?
Update
You are using String values as keys in your original code. You then have an Object key in your remove(key) method, so I expect you are passing an Integer to the method at this point. A String will not match an Integer as a key, which explains why your remove was not working.
If you use generics to specify your HashMap (LinkedHashMap<Integer, Team> instead of <Object, Team>) this kind of error can't happen - the compiler will say something like
The method put(Integer, Object) in the type HashMap<Integer,Object> is not applicable for the arguments (String, String)
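To see the String/Integer mismatch concretely, here is a minimal standalone sketch of the behaviour described above:

import java.util.LinkedHashMap;
import java.util.Map;

public class KeyMismatchDemo {
    public static void main(String[] args) {
        Map<Object, String> map = new LinkedHashMap<>();
        map.put("1", "team A");                             // String key, as in the original code
        System.out.println(map.remove(Integer.valueOf(1))); // null: no Integer key 1 in the map
        System.out.println(map.size());                     // still 1, nothing was removed
        System.out.println(map.remove("1"));                // "team A": the String key matches
    }
}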
Yanamon is right. It's pretty clear when you look at the diff:
while (rs.next()) {
- String id = rs.getString(FIELDS[0]);
+ Integer id = rs.getInt(FIELDS[0]);
String name = rs.getString(FIELDS[1]);
- Data team = new Data(Integer.parseInt(id.trim()), name);
+ Data team = new Data(id, name);
list.put(id, team);
}
Note that in the original version, an int (auto-boxed to an Integer) is being passed into the Data constructor. But id, which is what gets put into the map, is still a String.
My guess is that in the second case you cast it explicitly to an Integer:
Integer id = rs.getInt(FIELDS[0]);
while in the first case it remains an int:
Integer.parseInt(id.trim())
From the Javadoc of parseInt:
static int parseInt(String s)
Parses the string argument as a signed decimal integer.
If I were you I would inspect the contents of the LinkedHashMap using a debugger, before and after your put and before and after your remove. Step into the remove() method (the source code is part of the JDK) and see what it is doing. Odds are your code is not adding or removing the object correctly. It's hard to see here because the code sample is incomplete.
As for rs.getInt() and Integer.parseInt(), the first is database-vendor specific (I assume rs is a ResultSet), and thus they may not have the same behaviour. However, once the Integer key is created (you can verify this with your debugger) it should be equivalent for HashMap or LinkedHashMap purposes. But your code sample complicates things further; you are using rs.getString() and then Integer.parseInt(). While I would be surprised if this happened, it's possible that the database driver is formatting the id column into a string that confuses parseInt(). To me it's far more readable to just do rs.getInt().
