I'm having a specific error when running a scan expression on a table in my DynamoDB database. There is currently one item in my Projects table. This item contains a project description and a list of Strings of teammates. When retrieving the project description, my code prints the correct name of the project. However, when trying to retrieve the list of teammates from the same item, it says the list is a null object reference. I cannot work out why the list being returned is null. Assume all permissions have been set properly in the IAM console and the database.
Below is the code for my thread which scans the table.
public void run() {
    DynamoDBScanExpression scanExpression = new DynamoDBScanExpression();
    // returns a list of items from the table. each item is of Project type
    List<Project> scanResult = mapper.scan(Project.class, scanExpression);
    // for each project within scanResult do the following
    for (Project project : scanResult) {
        // retrieve the name of the team (this portion of the code logs the project name properly)
        String team = project.getProjectname();
        Log.v("team", team.toString());
        // The list being returned from this one line below is null??
        List<String> teammates = project.getTeammates();
        Log.v("Teammate", teammates.get(0));
    }
}
};
Thread mythread = new Thread(runnable);
mythread.start();
Below is the code for the Project class, which serves as a template when scanning the table. This is most likely the area of the issue, because the project description String is returned properly but the List of teammates isn't. Perhaps I am not supposed to be using a List, or the List is not defined properly, or the Java annotations are not used correctly on this table. However, I cannot find the issue.
package com.example.varun.finalproject;

import com.amazonaws.mobileconnectors.dynamodbv2.dynamodbmapper.*;

import java.util.List;

/**
 * Created by Varun on 4/10/17.
 */
@DynamoDBTable(tableName = "SBUProjects")
public class Project {
    private String ProjectName;
    private List<String> TeamMates;

    @DynamoDBHashKey(attributeName = "ProjectName")
    public String getProjectname() {
        return ProjectName;
    }

    public void setProjectname(String projectName) {
        this.ProjectName = projectName;
    }

    public List<String> getTeammates() {
        return TeamMates;
    }

    public void setTeammates(List teammates) {
        this.TeamMates = TeamMates;
    }
}
Lastly, here is a photo of my table and the item that contains a String for the project description and a List of Strings for my TeamMates. I assumed that because the table treats TeamMates as a list, I should also use a List when returning teammates.
http://i67.tinypic.com/2qnm58h.jpg
Help would be appreciated.
You need to set the annotation above the getTeammates() method, similar to how you set it for the getProjectname() method.
For example,
@DynamoDBAttribute(attributeName = "teammates")
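A minimal sketch of the annotated getter; the attributeName must match the attribute name in the table exactly, including case, so if the table stores it as "TeamMates" (as the question's field naming suggests), use that spelling:

@DynamoDBAttribute(attributeName = "TeamMates")
public List<String> getTeammates() {
    return TeamMates;
}

Note also that the posted setter assigns the field to itself (this.TeamMates = TeamMates;), so even a correctly mapped attribute would never be stored; it should assign the parameter instead (this.TeamMates = teammates;).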
Related
I have a requirement for a Java model that has a couple of attributes, some of which will have values and some of which won't, and this is not fixed. For example, out of 4 attributes, 3 may have values when passed to the controller method, or it may happen that only 2 have values and the rest are null. To handle this I chose to use Spring's Query by Example, but I am getting
java.lang.IllegalArgumentException : "Should not reach the end of iterator"
I am trying to fetch data from an Azure Cosmos DB. Below is the code I have used:
ExampleMatcher matcher = ExampleMatcher.matching().withIgnoreNullValues();
Example<RequestModel> exampleQ = Example.of(new RequestModel(
        req.getEmp(),  // these attributes may alternatively have values or be empty
        req.getBase(),
        req.getSeat(),
        req.getRent()
), matcher);
sampleRepo.findByEmpOrBaseOrSeatOrRent(exampleQ); // here I am getting the exception
The Repository
public interface SampleRepo extends CosmosRepository<TableA, String>, BaseContainerRepo {
}
The container
@Container(containerName = "${container-tableA}")
public class TableA extends BaseContainer {
}
Base model class
public class BaseContainer {
    @Id
    private String id;
    private Integer emp;
    @PartitionKey
    private String key;
    private String base;
    private String eqp;
}
The base container repo
public interface BaseContainerRepo {
    List<BaseContainer> findByEmpOrBaseOrSeatOrRent(Example<RequestModel> exampleQ);
}
Can anyone please let me know where I am going wrong?
I'm having trouble converting between java.sql.Timestamp and java.time.Instant using jOOQ converters.
Here's a simplified version of the code I'm working with.
public class User {
    private static final Converter<Timestamp, Instant> MY_CONVERTER = Converter.of(
            Timestamp.class,
            Instant.class,
            t -> t == null ? null : t.toInstant(),
            i -> i == null ? null : Timestamp.from(i)
    );

    public static Table<?> table = DSL.table("user");
    public static Field<String> name = DSL.field(DSL.name(table.getName(), "name"), String.class);
    public static Field<Instant> created = DSL.field(DSL.name(table.getName(), "created"),
            SQLDataType.TIMESTAMP.asConvertedDataType(MY_CONVERTER));
}
public class UserDto {
    private String name;
    private Instant created;
    // getters, setters, etc.
}
public class UserWriter {
    // constructor with injected DefaultDSLContext etc.

    public void create(UserDto user) {
        dslContext.insertInto(User.table, User.name, User.created)
                .values(user.getName(), user.getCreated())
                .execute();
    }
}
public class UserReader {
    // constructor with injected DefaultDSLContext etc.

    public Result<Record> getAll() {
        return dslContext.select().from(User.table).fetch();
    }
}
public class UserService {
    // constructor with injected UserReader etc.

    public Collection<UserDto> getAll() {
        return userReader
                .getAll()
                .stream()
                .map(Users::from)
                .collect(Collectors.toList());
    }
}
public class Users {
    public static UserDto from(Record record) {
        UserDto user = new UserDto();
        user.setName(record.get(User.name));
        user.setCreated(record.get(User.created));
        return user;
    }
}
When I create a new User the converter is called and the insertion works fine. However, when I select the Users the converter isn't called and the record.get(User.created) call in the Users::from method returns a Timestamp (and therefore fails as UserDto.setCreated expects an Instant).
Any ideas?
Thanks!
Why the converter isn't applied
From the way you phrased your question (you didn't post the exact SELECT statement that you've tried), I'm assuming you didn't pass all the column expressions explicitly. But then, how would jOOQ be able to find out what columns your table has? You declared some column expressions in some class, but that class isn't following any structure known to jOOQ. The only way to get jOOQ to fetch all known columns is to make them known to jOOQ, using code generation (see below).
You could, of course, let User extend the internal org.jooq.impl.TableImpl class and use internal API to register the Field values. But why do that manually, if you can generate this code?
Code generation
I'll repeat the main point of my previous question, which is: Please use the code generator. I've now written an entire article on why you should do this. Once jOOQ knows all of your meta data via code generation, you can just automatically select all columns like this:
UserRecord user = ctx
.selectFrom(USER)
.where(USER.ID.eq(...))
.fetchOne();
Not just that, you can also configure your data types as INSTANT using a <forcedType>, so you don't need to worry about data type conversion every time.
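For illustration, a minimal <forcedType> sketch for the code generator configuration, assuming a jOOQ version that ships the built-in INSTANT rewrite (SQLDataType.INSTANT); the match expression targets the question's user.created column and would be adapted to your schema:

<forcedTypes>
    <forcedType>
        <!-- Rewrite matching TIMESTAMP columns to java.time.Instant -->
        <name>INSTANT</name>
        <!-- Placeholder pattern: adapt to the table/column names in your schema -->
        <includeExpression>.*\.USER\.CREATED</includeExpression>
        <includeTypes>TIMESTAMP</includeTypes>
    </forcedType>
</forcedTypes>

With that in place, the generated USER.CREATED field is typed as Field<Instant>, and the conversion is applied transparently on both reads and writes.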
I cannot stress this enough, and I'm frequently surprised how many projects try to use jOOQ without code generation, which removes so much of jOOQ's power. The main reason to not use code generation is if your schema is dynamic, but since you have that User class, it obviously isn't dynamic.
I am trying to read data from an azure table using the below code in an android project.
TableQuery<Observation> rangeQuery =
        TableQuery.from(Observation.class)
                  .where(combinedFilter);

Iterable<Observation> results = cloudTable.execute(rangeQuery);

// Loop through the results, displaying information about the entity
for (Observation entity : results) {
    res.add(entity);
}
As soon as I try to enumerate results, it throws a java.lang.NoClassDefFoundError: com.fasterxml.jackson.core.JsonFactory exception.
A table entity looks like this:
{"PartitionKey":"temperature",
"RowKey":"2014-12-19 23:15:19",
"Timestamp":"2014-12-19T23:15:20.2638537Z",
"humidity":38.0,
"temp":22.0,
"datetime":"2014-12-19 23:15:19"}
And the corresponding class is:
public class Observation extends TableServiceEntity {
String temp;
String humidity;
String datetime;
String PartitionKey;
String RowKey;
String Timestamp;
}
I suspect this is a serialization error. But I can't see anything wrong since all the properties are implemented in the Observation class.
It seems the Azure SDK doesn't pull in one of its dependencies. It can be downloaded from: jackson-core
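For example, if the project builds with Maven, the missing artifact can be declared like this (the version shown is illustrative for that era of the SDK; use whichever release your Azure SDK version expects):

<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-core</artifactId>
    <version>2.4.4</version> <!-- illustrative version -->
</dependency>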
I have a bean in my Fusion Web Application where I'm supposed to insert new data into a table of my database through java code (after appropriate validation).
The question is how should I do the insertion?
Should I use Entity Objects?
How?
P.S.: This is not the way it should work http://jneelmani.blogspot.com/2009/11/adf-insert-using-storeprocedure.html
I created an Entity Object and a View Object from the database table "Employees", and then created an application module that includes this view object (Java classes were also generated for the entity object, view object and appModule; EmployeeInfo is just a POJO). Inside the application module I created these methods:
public EmployeeViewRowImpl saveEmployee(EmployeeInfo employeeInfo) {
    // Get the ViewObject
    EmployeeViewImpl employeeView = getEmployeeView1();
    // Prepare a new row
    EmployeeViewRowImpl employee = createEmployeeViewRowImpl(employeeView, employeeInfo);
    // Perform the insert operation
    employeeView.insertRow(employee);
    // Commit
    try {
        getDBTransaction().commit();
        return employee;
    } catch (JboException e) {
        getDBTransaction().rollback();
        return null;
    }
}

private EmployeeViewRowImpl createEmployeeViewRowImpl(EmployeeViewImpl employeeView, EmployeeInfo employeeInfo) {
    EmployeeViewRowImpl employee = (EmployeeViewRowImpl) employeeView.createRow();
    employee.setName(employeeInfo.getName());
    return employee;
}
And to use this one should just call:
public static AppModuleImpl getApp() {
    return (AppModuleImpl) Configuration.createRootApplicationModule(
            "com.test.service.AppModule", // where your module is stored
            "AppModuleShared");           // chosen configuration
}
and then
...
AppModuleImpl app = getApp();
app.saveEmployee(employeeInfo);
...
Maybe I'm not too clear on the dynamics of what you are trying to do, but with Oracle ADF, CRUD operations (such as insert) are easily handled by exposing them from Data Controls. To be more specific, once you have an EO, you should create a View Object and an Application Module. After that, inside AppMod -> Data Model, add the created VO. This way it will be exposed in the Data Controls panel, and you can expand the 'Operations' folder and drag and drop the CreateInsert operation, possibly within a form or an updatable table.
Please refer to this link: CreateInsert Operation - ADF.
If for some other reason you want to handle this process programmatically, I can think of two possible ways:
1. Get an instance of the above-mentioned AppMod into your managed-bean code, and from that, a VO instance:
AppModule mod = (AppModule) Configuration.createRootApplicationModule("packageName.AppModule", "AppModuleLocal");
ViewObject vo = mod.getViewObject1();
After that, create a new row and commit the newly added values, as sketched below.
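A minimal sketch of that last step, reusing the mod and vo variables from above (the attribute name and value are placeholders):

// Create a new row on the view object, populate it, and commit the transaction
Row newRow = vo.createRow();
newRow.setAttribute("Name", "John Doe"); // placeholder attribute/value
vo.insertRow(newRow);
mod.getTransaction().commit();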
2. If you have already exposed a UI component (such as a table), you can grab the Binding Context of the current page and, from the table's iterator, create a new row:
DCBindingContainer bc = (DCBindingContainer) BindingContext.getCurrent().getCurrentBindingsEntry();
DCIteratorBinding iterator = bc.findIteratorBinding("ViewObject1Iterator");
Row r = iterator.getCurrentRow();
r.setAttribute("attribName", attribValue);
You can do the insertion using an entity object as below:
/* Create a new Customer and return the new id */
public long createCustomer(String name, String city, Integer countryId) {
    EntityDefImpl customerDef = CustomerImpl.getDefinitionObject();
    CustomerImpl newCustomer =
            (CustomerImpl) customerDef.createInstance2(getDBTransaction(), null);
    newCustomer.setName(name);
    newCustomer.setCity(city);
    newCustomer.setCountryId(countryId);
    try {
        getDBTransaction().commit();
    }
    catch (JboException ex) {
        getDBTransaction().rollback();
        throw ex;
    }
    DBSequence newIdAssigned = newCustomer.getId();
    return newIdAssigned.getSequenceNumber().longValue();
}
I'm using greenDAO and Volley. So I have the following problem: when I make a network request I need to parse the response with Gson, so I have one model to represent the entities retrieved from the server and another model to represent the greenDAO objects. Is there any way to have only one class per model that serves both as the Gson model and the ORM class?
class Product:
#SerializedName("id")
private String id;
#SerializedName("pictures")
private List<Picture> pictures;
get & set
class PersistentProduct:
private Long id;
private List<PersistencePicture> pictures;

/** To-many relationship, resolved on first access (and after reset). Changes to to-many relations are not persisted; make changes to the target entity. */
public List<PersistencePicture> getPictures() {
    if (pictures == null) {
        if (daoSession == null) {
            throw new DaoException("Entity is detached from DAO context");
        }
        PersistencePictureDao targetDao = daoSession.getPersistencePictureDao();
        List<PersistencePicture> picturesNew = targetDao._queryPersistenceProduct_Pictures(id);
        synchronized (this) {
            if (pictures == null) {
                pictures = picturesNew;
            }
        }
    }
    return pictures;
}
First I thought of making an interface, but when you retrieve the data from a DAO, the DAO returns the class and not the interface, so I don't think it can be done that way. The only solution I found is to make a "ProductUtils" that converts a "PersistentProduct" to a "Product" and vice versa, as sketched below.
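A minimal sketch of such a mapper, under the assumption that both classes expose matching getters and setters (the method names here are illustrative, not taken from the question):

// Hypothetical converter between the Gson model (Product) and the greenDAO entity (PersistentProduct)
public final class ProductUtils {
    private ProductUtils() {}

    public static PersistentProduct toPersistentProduct(Product p) {
        PersistentProduct persistent = new PersistentProduct();
        persistent.setId(Long.valueOf(p.getId())); // the Gson model stores the id as a String
        // ... copy the remaining fields and convert each Picture to a PersistencePicture
        return persistent;
    }

    public static Product toProduct(PersistentProduct p) {
        Product product = new Product();
        product.setId(String.valueOf(p.getId()));
        // ... copy the remaining fields and convert each PersistencePicture to a Picture
        return product;
    }
}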
The most elegant way would be to implement a small extension for greenDAO, so that you can specify the serialized name during schema creation.
For Example:
de.greenrobot.daogenerator.Property.java:
// in PropertyBuilder append these lines
public PropertyBuilder setSerializedName(String sname) {
    // Check sname for correctness (i.e. not empty, not containing illegal characters)
    property.serializedName = sname;
    return this;
}

// in Property append these lines
private String serializedName = null;

public boolean isSerialized() {
    return serializedName != null;
}
In entity.ftl add this line after line 24 (after package ${entity.javaPackage};):
<#if property.serializedName??>
import com.google.gson.annotations.SerializedName;
</#if>
And after line 55 (after: <#list entity.properties as property>)
<#if property.serializedName??>
#SerializedName("${property.serializedName}")
</#if>
Afterwards you should be able to use your generated greenDAO entity for Volley, with the following restrictions:
1. If you get a Product over the network, nothing is changed in the DB yet. You have to call insertOrReplace().
2. If you get a Product from the DB and send it via the network, some undesired fields might be serialized (i.e. myDao and daoSession).
3. If you get a Product via the network and call insertOrReplace(), the "network" Product will be persisted and an already existing Product will be replaced by it, BUT the referenced entities won't get updated or persisted if insertOrReplace() isn't called for each of them!
4. If you get a Product via the network and call insertOrReplace() for every referenced entity, toMany-entities that were referenced by the db-Product are still referenced by the updated Product, although they are not listed in the updated Product. You have to call resetPictures() and getPictures() to get the correct list, which will contain all toMany-entities referenced by either the original Product stored in the DB or the updated Product from the network.
Update addressing 2.
To prevent daoSession and myDao from being serialized, you can use the following ExclusionStrategy:
private static class TransientExclusionStrategy implements ExclusionStrategy {
    public boolean shouldSkipClass(Class<?> clazz) {
        return (clazz.getModifiers() & java.lang.reflect.Modifier.TRANSIENT) != 0;
    }

    public boolean shouldSkipField(FieldAttributes f) {
        return f.hasModifier(java.lang.reflect.Modifier.TRANSIENT);
    }
}
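For completeness, a sketch of wiring the strategy into Gson (greenDAO generates the daoSession and myDao fields as transient, which is what the strategy keys on):

// Register the strategy for both serialization and deserialization
Gson gson = new GsonBuilder()
        .setExclusionStrategies(new TransientExclusionStrategy())
        .create();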
Update addressing 1.,3. and 4.
As a fast solution you can add the following method in the KEEP-SECTIONS of your entity:
public void merge(DaoSession s) {
    s.insertOrReplace(this);
    // do this for all toMany-relations accordingly
    for (Picture p : getPictures()) {
        s.insertOrReplace(p);
    }
    resetPictures();
}
This will result in the original entity being updated and attached to the session and DAO. Also, every Picture referenced by the network-Product will be persisted or updated. Pictures referenced by the original entity, but not by the network-entity, remain untouched and get merged into the list.
This is far from perfect, but it shows where to go and what to do. The next steps would be to do everything that is done in merge() inside one transaction and then to integrate different merge-methods into dao.ftl.
NOTE
The code given in this answer is neither complete nor tested; it is meant as a hint on how to solve this. As pointed out above, this solution still has some restrictions that have to be dealt with.