I am trying to get Guava caching working for my app. Specifically, I'm looking for a cache that behaves like a map:
// Here the keys are the User.getId() and the values are the respective User.
Map<Long, User> userCache = new HashMap<Long, User>();
Here's what I've pieced together from various online sources (docs, blogs, articles, etc.):
// My POJO.
public class User {
Long id;
String name;
// Lots of other properties.
}
public class UserCache {
LoadingCache _cache;
UserCacheLoader loader;
UserCacheRemovalListener listener;
UserCache() {
super();
this._cache = CacheBuilder.newBuilder()
.maximumSize(1000)
.expireAfterAccess(30, TimeUnit.SECONDS)
.removalListener(listener)
.build(loader);
}
User load(Long id) {
_cache.get(id);
}
}
class UserCacheLoader extends CacheLoader {
@Override
public Object load(Object key) throws Exception {
// ???
return null;
}
}
class UserCacheRemovalListener implements RemovalListener<String, String>{
@Override
public void onRemoval(RemovalNotification<String, String> notification) {
System.out.println("User with ID of " + notification.getKey() + " was removed from the cache.");
}
}
But I'm not sure how/where to specify that keys should be Long types, and cached values should be User instances. I'm also looking to implement a store(User) (basically a Map#put(K,V)) method as well as a getKeys() method that returns all the Long keys in the cache. Any ideas as to where I'm going awry?
Use generics:
class UserCacheLoader extends CacheLoader<Long, User> {
@Override
public User load(Long key) throws Exception {
// ???
}
}
store(User) can be implemented with Cache.put, just like you'd expect.
getKeys() can be implemented with cache.asMap().keySet().
You can (and should!) not only specify the return type of the overridden load method of CacheLoader to be User, but also parameterize the RemovalListener so that the onRemoval argument is properly typed:
class UserCacheRemovalListener implements RemovalListener<Long, User>{
@Override
public void onRemoval(RemovalNotification<Long, User> notification) {
// ...
}
}
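Putting those pieces together, a minimal sketch of the fully generic cache might look like the following, assuming the User POJO above and the RemovalListener<Long, User> just shown; loadUserFromDatabase(...) is a hypothetical stand-in for however you actually load users:
// imports: com.google.common.cache.*, java.util.Set, java.util.concurrent.*
public class UserCache {
    private final LoadingCache<Long, User> cache = CacheBuilder.newBuilder()
            .maximumSize(1000)
            .expireAfterAccess(30, TimeUnit.SECONDS)
            .removalListener(new UserCacheRemovalListener())
            .build(new CacheLoader<Long, User>() {
                @Override
                public User load(Long id) throws Exception {
                    return loadUserFromDatabase(id); // hypothetical lookup
                }
            });

    public User load(Long id) throws ExecutionException {
        return cache.get(id);
    }

    // The Map#put(K, V) equivalent asked for.
    public void store(User user) {
        cache.put(user.getId(), user);
    }

    // All Long keys currently in the cache.
    public Set<Long> getKeys() {
        return cache.asMap().keySet();
    }

    private User loadUserFromDatabase(Long id) {
        // hypothetical: replace with your real lookup (database, RPC, ...)
        return new User();
    }
}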
I want to create a series of Actions that do related things:
public interface Action{
public void execute();
}
public class DatabaseAction implements Action{
public void execute(){}
}
public class WebAction implements Action{
public void execute(){}
}
public class EmailAction implements Action{
public void execute(){}
}
Generally speaking, users don't care about the details. They want all the actions to run and not worry about it.
But there's going to be some special cases where they only want to run some of the actions, and configure some of the actions.
And I suppose there could be cases where configuration is non-optional.
I figure a fluent interface is the most readable here.
// Executes all Actions - intended to be used in almost all cases
// I write to a database, call a web API, and send an email.
Actions.withAllDefaults().execute();
// I don't need to send an email and I need to configure the database
Actions.withAction(DATABASE_ACTION)
.withConfiguration(DatabaseAction.PORT, 9000)
.withAction(WEB_ACTION)
.execute();
It feels like I should be implementing some sort of factory but it's hard for me to actually translate that into code.
Consider using the Fluent Builder Pattern instead of trying to make your factory fluent.
C# uses fluent programming extensively in LINQ to build queries using "standard query operators".
This is a C# implementation; it should be straightforward to convert to Java, since no C#-specific features are used.
So let's see an example. We start with an interface IFluent that lets you build your actions with settings:
public interface IFluent
{
IFluent WithAction(Action action);
IFluent WithConfiguration(KeyValuePair<string, object> configuration);
}
and this is Fluent class which implements IFluent interface:
public class Fluent : IFluent
{
private List<Action> actions;
public IFluent WithAction(Action action)
{
if (actions == null)
actions = new List<Action>();
actions.Add(action);
return this;
}
public IFluent WithConfiguration(KeyValuePair<string, object> configuration)
{
if (actions == null || actions.Count == 0)
throw new InvalidOperationException("There are no actions");
int currentActionIndex = actions.Count - 1;
actions[currentActionIndex].Set(configuration);
return this;
}
}
Then we create an abstract class for Action that should define behavior for derived classes:
public abstract class Action
{
public abstract Dictionary<string, object> Properties { get; set; }
public abstract void Execute();
public abstract void Set(KeyValuePair<string, object> configuration);
public abstract void Add(string name, object value);
}
And our derived classes would look like this:
public class DatabaseAction : Abstract.Action
{
public override Dictionary<string, object> Properties { get; set; }
= new Dictionary<string, object>()
{
{ "port", 0},
{ "connectionString", "foobarConnectionString"},
{ "timeout", 60}
};
public override void Execute()
{
Console.WriteLine("It is a database action");
}
public override void Set(KeyValuePair<string, object> configuration)
{
if (Properties.ContainsKey(configuration.Key))
Properties[configuration.Key] = configuration.Value;
}
public override void Add(string name, object value)
{
Properties.Add(name, value);
}
}
and EmailAction:
public class EmailAction : Abstract.Action
{
public override Dictionary<string, object> Properties { get; set; }
= new Dictionary<string, object>()
{
{ "from", "Head First - Object Oriented Design"},
{ "to", "who wants to learn object oriented design"},
{ "index", 123456}
};
public override void Execute()
{
Console.WriteLine("It is a email action");
}
public override void Set(KeyValuePair<string, object> configuration)
{
if (Properties.ContainsKey(configuration.Key))
Properties[configuration.Key] = configuration.Value;
}
public override void Add(string name, object value)
{
Properties.Add(name, value);
}
}
and WebAction:
public class WebAction : Abstract.Action
{
public override Dictionary<string, object> Properties { get; set; }
= new Dictionary<string, object>()
{
{ "foo", "1"},
{ "bar", "2"},
{ "hey", "hi"}
};
public override void Execute()
{
Console.WriteLine("It is a email action");
}
public override void Set(KeyValuePair<string, object> configuration)
{
if (Properties.ContainsKey(configuration.Key))
Properties[configuration.Key] = configuration.Value;
}
public override void Add(string name, object value)
{
Properties.Add(name, value);
}
}
Then it is possible to call the code like this:
Fluent actions = new Fluent();
actions.WithAction(new DatabaseAction())
.WithConfiguration(new KeyValuePair<string, object>("port", 1))
.WithAction(new EmailAction())
.WithConfiguration(new KeyValuePair<string, object>("to", "me"));
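And since the question is in Java, here is a rough sketch of the same fluent-builder idea translated back; the Actions class and the configure(String, Object) hook on Action are assumptions that are not in the original code, and the chained method is called andAction so it doesn't clash with the static entry point:
import java.util.ArrayList;
import java.util.List;

public class Actions {
    private final List<Action> actions = new ArrayList<>();

    private Actions() {}

    // Mirrors Actions.withAllDefaults() from the question.
    public static Actions withAllDefaults() {
        return new Actions()
                .andAction(new DatabaseAction())
                .andAction(new WebAction())
                .andAction(new EmailAction());
    }

    // Mirrors Actions.withAction(...) from the question.
    public static Actions withAction(Action action) {
        return new Actions().andAction(action);
    }

    public Actions andAction(Action action) {
        actions.add(action);
        return this;
    }

    // Configures the most recently added action; assumes Action exposes configure(String, Object).
    public Actions withConfiguration(String key, Object value) {
        if (actions.isEmpty()) {
            throw new IllegalStateException("There are no actions to configure");
        }
        actions.get(actions.size() - 1).configure(key, value);
        return this;
    }

    public void execute() {
        actions.forEach(Action::execute);
    }
}
A hypothetical call site would then mirror the question's examples:
Actions.withAllDefaults().execute();
Actions.withAction(new DatabaseAction())
        .withConfiguration("port", 9000)
        .andAction(new WebAction())
        .execute();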
I'm having a problem with inheritance. I've illustrated it with these account classes:
class Account {
accountLevel = BASIC;
connect() {...}
changePassword() { ... }
changeEmail() { ... }
}
class CustomerAccount extends Account {
accountLevel = CUSTOMER;
createOrder() { ... }
payOrder() { ... }
}
class AdminAccount extends Account {
accountLevel = ADMIN;
addProduct() { ... }
deleteProduct() { ... }
}
If I have an instance of an Account and I want to cast it to an AdminAccount to call addProduct(), is it okay to do this:
Account account = new AdminAccount();
if(account.accountLevel == ADMIN){
AdminAccount adminAccount = (AdminAccount) account;
adminAccount.addProduct();
}
Edit:
I want to have all my accounts in the same List, because they all use the same connect method. But once an account is connected I still have an Account object, and I want to use its dynamic type's methods, so I'm forced to cast the object.
It's almost like an instanceof but with extra steps. I feel like having to cast the account isn't very OOP. Is there a more elegant solution? Maybe not use inheritance between the account classes?
One way to approach this is:
AdminAccount account = new AdminAccount();
account.addProduct();
If the expectation is that only an AdminAccount can add a product, the relevant methods should expect an AdminAccount as input rather than an Account. Is there a reason the instance is being created as an Account instead of an AdminAccount?
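For illustration only, a hypothetical collaborator whose signature encodes that rule (ProductCatalog is a made-up name, not from the question):
class ProductCatalog {
    // Only an AdminAccount can add products; the compiler enforces it,
    // so no cast or accountLevel check is needed here.
    void addProduct(AdminAccount admin, String productName) {
        // ... persist the product, audit the admin, etc.
    }
}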
Edit: a sample of packaging the type handling into a manager class.
public static void main(String args[])
{
AccountManager manager = new AccountManager();
manager.addAccount(new User());
manager.addAccount(new User());
manager.addAccount(new Admin());
System.out.println(manager.getUsers().collect(Collectors.toList()));
}
public static final class AccountManager
{
private final List<Account> allAccounts = new ArrayList<>();
public Stream<User> getUsers(){ return getAccountType(User.class); }
public Stream<Admin> getAdmins(){ return getAccountType(Admin.class); }
public Stream<Account> getAccounts(){ return getAccountType(Account.class); }
public <T extends Account> void addAccount(T account) { if(account.connect()) allAccounts.add(account); }
@SuppressWarnings("unchecked")
private <T> Stream<T> getAccountType(Class<T> type){ return allAccounts.stream().filter(type::isInstance).map(act -> (T) act); }
}
public static final class User extends Account{}
public static final class Admin extends Account{}
public static class Account { boolean connect(){ return true; } }
In general, it's considered bad practice to branch on a type attribute or on instanceof checks.
An alternative to this is the visitor pattern. Here's a sketch of what you could do:
interface IAccount {
void process(AccountProcessor processor);
}
interface AccountProcessor {
void processBasicAccount(Account acc);
void processCustomerAccount(CustomerAccount acc);
void processAdminAccount(AdminAccount acc);
}
The idea is that you have an interface that defines a process method for all the accounts, which receives an AccountProcessor (you can change the names here; this is just an example). The AccountProcessor should have one method for every different type of account. You could even use method overloading, i.e. all methods could be named something like processAccount and receive a specialized account type.
Now, your Account class and all its subtypes should implement the IAccount interface:
class Account implements IAccount {
@Override
public void process(AccountProcessor processor) {
processor.processBasicAccount(this);
}
// all the other Account stuff
}
class CustomerAccount implements IAccount {
@Override
public void process(AccountProcessor processor) {
processor.processCustomerAccount(this);
}
// all the other CustomerAccount stuff
}
class AdminAccount implements IAccount {
@Override
public void process(AccountProcessor processor) {
processor.processAdminAccount(this);
}
// all the other AdminAccount stuff
}
Now, with all these pieces in place, you are ready to process your list of accounts:
class AccountProcessorImpl implements AccountProcessor {
private List<Account> accounts; // filled with all the accounts
void doSomethingWithAllTheAccounts() {
// iterate all the accounts and process each one of them
accounts.forEach(account -> account.process(this));
}
@Override
public void processBasicAccount(Account acc) {
// do something with the basic account
}
@Override
public void processCustomerAccount(CustomerAccount acc) {
// do something with the customer account
}
@Override
public void processAdminAccount(AdminAccount acc) {
// do something with the admin account
}
}
I hope you grasp the general idea. This is just a sketch to show you the core technique. There might be variants, i.e. you might have separate processors for each different type of account, or you might process the list of accounts in a class other than the AccountProcessorImpl, etc.
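A brief usage sketch of the wiring above, assuming the subclasses still extend Account as in the question and that AccountProcessorImpl receives its list through a (hypothetical) constructor:
List<Account> accounts = List.of(new Account(), new CustomerAccount(), new AdminAccount());
AccountProcessorImpl processor = new AccountProcessorImpl(accounts); // hypothetical constructor
// Each account dispatches itself to the matching processXxxAccount method.
processor.doSomethingWithAllTheAccounts();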
It's OK to do that. Note that this is a downcast rather than polymorphism, though: a parent reference can only be cast to a child type when the underlying object really is an instance of that child, which is exactly what your accountLevel check (or an instanceof test) guarantees before the cast.
This is related to Java generic wildcards.
I have an interface like this:
public interface Processer<P, X> {
void process(P parent, X result);
}
An implementation looks like this:
public class FirstProcesser implements Processer<User, String> {
@Override
public void process(User parent, String result) {
}
}
And I'm using the processer like this:
public class Executor {
private Processer<?, String> processer;
private int i;
public void setProcesser(Processer<?, String> processer) {
this.processer = processer;
}
private String generateString() {
return "String " + i++;
}
public <P> void execute(P parent) {
processer.process(parent, generateString());
}
public static void main(String[] args) {
Executor executor = new Executor();
executor.setProcesser(new FirstProcesser());
User user = new User();
executor.execute(user);
}
}
But here
public <P> void execute(P parent) {
processer.process(parent, generateString());
}
it gives the compile error Error:(18, 27) java: incompatible types: P cannot be converted to capture#1 of ?
I need to understand why this gives an error, and also how to solve it.
The wildcard basically means "I don't care which type is used here". In your case, you definitely do care though: the first type parameter of your processor must be the same as the P type in the execute method.
With the current code, you could call execute(1), which would try to call the FirstProcesser with an integer as argument, which obviously makes no sense; that is why the compiler forbids it.
The easiest solution would be to make your Executor class generic, instead of only the execute method:
public class Executor<P> {
private Processer<P, String> processer;
private int i;
public void setProcesser(Processer<P, String> processer) {
this.processer = processer;
}
private String generateString() {
return "String " + i++;
}
public void execute(P parent) {
processer.process(parent, generateString());
}
public static void main(String[] args) {
Executor<User> executor = new Executor<>();
executor.setProcesser(new FirstProcesser());
User user = new User();
executor.execute(user);
}
}
Because the processer field's first type argument can be anything: you may have assigned a Processer<Foo, String> to it, and of course the compiler will complain, as that type argument can be something different from the P in your execute().
You may want to make your Executor a generic class:
class Executor<T> {
private Processer<T, String> processer;
public void setProcesser(Processer<T, String> processer) {
this.processer = processer;
}
public void execute(T parent) {
processer.process(parent, generateString());
}
}
and your main will look like:
Executor<User> executor = new Executor<User>();
executor.setProcesser(new FirstProcesser());
User user = new User();
executor.execute(user);
In response to comments:
There is no proper solution with proper use of generics here, because what you are doing is contradictory: on one hand you say you do not care about the first type argument of Processor (hence private Processor<?, String> processor), but on the other hand you DO really care about it (in your execute()). The compiler is simply doing its job, as it is absolutely legal for you to assign a Processor<Foo, String> to that field.
If you don't really care about generics and are willing to suffer from the poorer design, then don't use generics.
Just keep Processor as a raw type in Executor and suppress all unchecked warnings, i.e.:
class Executor {
private Processor processor;
@SuppressWarnings("unchecked")
public void setProcessor(Processor<?, String> processor) {
this.processor = processor;
}
// your generic method does not do any meaningful check.
// just pass an Object to it
@SuppressWarnings("unchecked")
public void execute(Object parent) {
processor.process(parent, "");
}
}
And if it were me, I would go one step further:
Provide an Executor that is properly designed (e.g. call it TypedExecutor). All new code should use the new, properly designed TypedExecutor. The original Executor is kept for the sake of backward compatibility and delegates its work to TypedExecutor.
Hence it would look like:
class TypedExecutor<T> {
private Processor<T, String> processor;
public void setProcessor(Processor<T, String> processor) {
this.processor = processor;
}
public void execute(T parent) {
processor.process(parent, "");
}
}
@SuppressWarnings("unchecked")
class Executor {
private TypedExecutor executor = new TypedExecutor();
public void setProcessor(Processor<?, String> processor) {
this.executor.setProcessor(processor);
}
public void execute(Object parent) {
this.executor.execute(parent);
}
}
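For completeness, a hypothetical call site for the delegating design above; it assumes FirstProcesser from the question is adapted to the Processor<User, String> interface used in this answer:
// New code talks to the type-safe TypedExecutor.
TypedExecutor<User> typedExecutor = new TypedExecutor<>();
typedExecutor.setProcessor(new FirstProcesser());
typedExecutor.execute(new User());

// Legacy call sites can keep using the raw Executor facade unchanged;
// it simply delegates to a TypedExecutor internally.
Executor legacyExecutor = new Executor();
legacyExecutor.setProcessor(new FirstProcesser());
legacyExecutor.execute(new User());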
I am trying to incorporate a data cache for one of my GWT widgets.
I have a datasource interface/class which retrieves some data from my backend via RequestBuilder and JSON. Because I display the widget multiple times I only want to retrieve the data once.
So I tried to come up with an app cache. The naive approach is to use a HashMap in a singleton object to store the data. However, I also want to make use of HTML5's localStorage/sessionStorage if supported.
HTML5 localStorage only supports String values, so I have to convert my object into JSON and store it as a string. However, somehow I can't come up with a nice, clean way of doing this. Here is what I have so far.
I define an interface with two functions: fetchStatsList() fetches the list of stats that can be displayed in the widget, and fetchStatsData() fetches the actual data.
public interface DataSource {
public void fetchStatsData(List<Stat> stats, FetchStatsDataCallback callback);
public void fetchStatsList(FetchStatsListCallback callback);
}
The Stat class is a simple Javascript Overlay class (JavaScriptObject) with some getters (getName(), etc)
I have a normal, non-cacheable implementation RequestBuilderDataSource of my DataSource, which looks like the following:
public class RequestBuilderDataSource implements DataSource {
@Override
public void fetchStatsList(final FetchStatsListCallback callback) {
// create RequestBuilderRequest, retrieve response and parse JSON
callback.onFetchStatsList(stats);
}
@Override
public void fetchStatsData(List<Stat> stats,final FetchStatsDataCallback callback) {
String url = getStatUrl(stats);
//create RequestBuilderRquest, retrieve response and parse JSON
callback.onFetchStatsData(dataTable); //dataTable is of type DataTable
}
}
I left out most of the code for the RequestBuilder as it is quite straightforward.
This works out of the box; however, the list of stats and the data are retrieved every time, even though the data is shared among the widget instances.
For supporting caching I add a Cache interface and two Cache implementations (one for HTML5 localStorage and one for HashMap):
public interface Cache {
void put(Object key, Object value);
Object get(Object key);
void remove(Object key);
void clear();
}
I add a new class RequestBuilderCacheDataSource which extends the RequestBuilderDataSource and takes a Cache instance in its constructor.
public class RequestBuilderCacheDataSource extends RequestBuilderDataSource {
private final Cache cache;
public RequestBuilderCacheDataSource(final Cache cache) {
this.cache = cache;
}
@Override
public void fetchStatsList(final FetchStatsListCallback callback) {
Object value = cache.get("list");
if (value != null) {
callback.onFetchStatsList((List<Stat>)value);
}
else {
super.fetchStatsList(new FetchStatsListCallback() {
@Override
public void onFetchStatsList(List<Stat> stats) {
cache.put("list",stats);
callback.onFetchStatsList(stats);
}
});
}
}
@Override
public void fetchStatsData(List<Stat> stats,final FetchStatsDataCallback callback) {
final String url = getStatUrl(stats);
Object value = cache.get(url);
if (value != null) {
callback.onFetchStatsData((DataTable)value);
}
else {
super.fetchStatsData(stats,new FetchStatsDataCallback() {
@Override
public void onFetchStatsData(DataTable dataTable) {
cache.put(url,dataTable);
callback.onFetchStatsData(dataTable);
}
});
}
}
}
Basically, the new class will look up the value in the Cache; if it is not found, it will call the fetch function in the parent class, intercept the callback to put the result into the cache, and then call the actual callback.
So, in order to support both HTML5 localStorage and a plain JS HashMap as storage, I created two implementations of my Cache interface:
JS HashMap storage:
public class DefaultCacheImpl implements Cache {
private HashMap<Object, Object> map;
public DefaultCacheImpl() {
this.map = new HashMap<Object, Object>();
}
@Override
public void put(Object key, Object value) {
if (key == null) {
throw new NullPointerException("key is null");
}
if (value == null) {
throw new NullPointerException("value is null");
}
map.put(key, value);
}
@Override
public Object get(Object key) {
// Check for null as Cache should not store null values / keys
if (key == null) {
throw new NullPointerException("key is null");
}
return map.get(key);
}
@Override
public void remove(Object key) {
map.remove(key);
}
@Override
public void clear() {
map.clear();
}
}
HTML5 localStorage:
public class LocalStorageImpl implements Cache{
public static enum TYPE {LOCAL,SESSION}
private TYPE type;
private Storage cacheStorage = null;
public LocalStorageImpl(TYPE type) throws Exception {
this.type = type;
if (type == TYPE.LOCAL) {
cacheStorage = Storage.getLocalStorageIfSupported();
}
else {
cacheStorage = Storage.getSessionStorageIfSupported();
}
if (cacheStorage == null) {
throw new Exception("LocalStorage not supported");
}
}
@Override
public void put(Object key, Object value) {
//Convert Object (could be any arbitrary object) into JSON
String jsonData = null;
if (value instanceof List) { // in case it is a list of Stat objects
JSONArray array = new JSONArray();
int index = 0;
for (Object val:(List)value) {
array.set(index,new JSONObject((JavaScriptObject)val));
index = index +1;
}
jsonData = array.toString();
}
else // in case it is a DataTable
{
jsonData = new JSONObject((JavaScriptObject) value).toString();
}
cacheStorage.setItem(key.toString(), jsonData);
}
@Override
public Object get(Object key) {
if (key == null) {
throw new NullPointerException("key is null");
}
String jsonDataString = cacheStorage.getItem(key.toString());
if (jsonDataString == null) {
return null;
}
Object data = null;
Object jsonData = JsonUtils.safeEval(jsonDataString);
if (!key.equals("list"))
data = DataTable.create((JavaScriptObject) jsonData);
else if (jsonData instanceof JsArray){
JsArray<GenomeStat> jsonStats = (JsArray<GenomeStat>)jsonData;
List<GenomeStat> stats = new ArrayList<GenomeStat>();
for (int i = 0;i<jsonStats.length();i++) {
stats.add(jsonStats.get(i));
}
data = (Object)stats;
}
return data;
}
@Override
public void remove(Object key) {
cacheStorage.removeItem(key.toString());
}
@Override
public void clear() {
cacheStorage.clear();
}
public TYPE getType() {
return type;
}
}
The post got a little bit long, but hopefully it clarifies what I am trying to achieve. It boils down to two questions:
Feedback on the design/architecture of this approach (for example, subclassing RequestBuilderDataSource for the caching functionality, etc.). Can this be improved (this is probably more related to general design than specifically to GWT)?
With the DefaultCacheImpl it is really easy to store and retrieve arbitrary objects. How can I achieve the same thing with localStorage, where I have to convert to and parse JSON? I am using a DataTable, which requires calling the DataTable.create(JavaScriptObject jso) function to work. How can I solve this without too many if/else and instanceof checks?
My first thoughts: make it two layers of cache, not two different caches. Start with the in-memory map, so no serialization/deserialization is needed for reading a given object out, and so that changing an object in one place changes it in all. Then rely on the local storage to keep data around for the next page load, avoiding the need for pulling data down from the server.
I'd tend to say skip session storage, since that doesn't last long, but it does have its benefits.
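To make the two-layer idea concrete, here is a minimal sketch that composes the Cache implementations from the question (LayeredCache itself is a made-up name, and it assumes the persistent layer may be absent when localStorage is unsupported):
public class LayeredCache implements Cache {
    private final Cache memory = new DefaultCacheImpl();
    private final Cache persistent; // e.g. a LocalStorageImpl, or null if unsupported

    public LayeredCache(Cache persistent) {
        this.persistent = persistent;
    }

    @Override
    public void put(Object key, Object value) {
        memory.put(key, value);
        if (persistent != null) {
            persistent.put(key, value);
        }
    }

    @Override
    public Object get(Object key) {
        Object value = memory.get(key);
        if (value == null && persistent != null) {
            value = persistent.get(key);
            if (value != null) {
                memory.put(key, value); // promote to the fast in-memory layer
            }
        }
        return value;
    }

    @Override
    public void remove(Object key) {
        memory.remove(key);
        if (persistent != null) {
            persistent.remove(key);
        }
    }

    @Override
    public void clear() {
        memory.clear();
        if (persistent != null) {
            persistent.clear();
        }
    }
}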
For storing/reading data, I'd encourage checking out AutoBeans instead of using JSOs. This way you could support any type of data (that can be stored as an autobean) and could pass a Class param into the fetcher to specify what kind of data you will read from the server/cache, and decode the JSON to a bean in the same way. As an added bonus, autobeans are easier to define - no JSNI required. A method could look something like this (note that in DataSource and its impl, the signature is different).
public <T> void fetch(Class<T> type, List<Stat> stats, Callback<T, Throwable> callback);
That said, what is DataTable.create? If it is already a JSO, you can just cast to DataTable as you (probably) normally do when reading from the RequestBuilder data.
I would also encourage not returning a JSON array directly from the server, but wrapping it in an object, as a best practice to protect your users' data from being read by other sites. (Okay, on re-reading the issues, objects aren't great either). Rather than discussing it here, check out JSON security best practices?
So, all of that said, first define the data (not really sure how this data is intended to work, so just making up as I go)
public interface DataTable {
String getTableName();
void setTableName(String tableName);
}
public interface Stat {// not really clear on what this is supposed to offer
String getKey();
void setKey(String key);
String getValue();
String setValue(String value);
}
public interface TableCollection {
List<DataTable> getTables();
void setTables(List<DataTable> tables);
int getRemaining();//useful for not sending all if you have too much?
}
For autobeans, we define a factory that can create any of our data when given a Class instance and some data. Each of these methods can be used as a sort of constructor to create a new instance on the client, and the factory can be passed to AutoBeanCodex to decode data.
interface DataABF extends AutoBeanFactory {
AutoBean<DataTable> dataTable();
AutoBean<Stat> stat();
AutoBean<TableCollection> tableCollection();
}
Delegate all of the String<=>Object work to AutoBeanCodex, but you probably want some simple wrapper around it to make it easy to call from both the HTML5 cache and from the RequestBuilder results. Quick example here:
public class AutoBeanSerializer {
private final AutoBeanFactory factory;
public AutoBeanSerializer(AutoBeanFactory factory) {
this.factory = factory;
}
public <T> String encodeData(T data) {
//first, get the autobean mapped to the data
//probably throw something if we can't find it
AutoBean<T> autoBean = AutoBeanUtils.getAutoBean(data);
//then, encode it
//no factory or type needed here since the AutoBean has those details
return AutoBeanCodex.encode(autoBean).getPayload();
}
public <T> T decodeData(Class<T> dataType, String json) {
AutoBean<T> bean = AutoBeanCodex.decode(factory, dataType, json);
//unwrap the bean, and return the actual data
return bean.as();
}
}
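A hypothetical round trip with the factory and wrapper above, just to show the shape of the calls (the field values are made up):
DataABF factory = GWT.create(DataABF.class);
AutoBeanSerializer serializer = new AutoBeanSerializer(factory);

// Build a bean on the client, encode it for localStorage, then decode it back.
DataTable table = factory.dataTable().as();
table.setTableName("stats");
String json = serializer.encodeData(table);
DataTable restored = serializer.decodeData(DataTable.class, json);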
Could you guys please help me find where I made a mistake?
I switched from SimpleBeanEditorDriver to RequestFactoryEditorDriver and my code no longer saves the full graph, even though the with() method is called. But it correctly loads the full graph in the constructor.
Could it be caused by the circular reference between OrganizationProxy and PersonProxy? I don't know what else to think :( It worked with SimpleBeanEditorDriver though.
Below is my client code. Let me know if you want me to add the sources of the proxies to this question (or you can see them here).
public class NewOrderView extends Composite
{
interface Binder extends UiBinder<Widget, NewOrderView> {}
private static Binder uiBinder = GWT.create(Binder.class);
interface Driver extends RequestFactoryEditorDriver<OrganizationProxy, OrganizationEditor> {}
Driver driver = GWT.create(Driver.class);
@UiField
Button save;
@UiField
OrganizationEditor orgEditor;
AdminRequestFactory requestFactory;
AdminRequestFactory.OrderRequestContext requestContext;
OrganizationProxy organization;
public NewOrderView()
{
initWidget(uiBinder.createAndBindUi(this));
requestFactory = createFactory();
requestContext = requestFactory.contextOrder();
driver.initialize(requestFactory, orgEditor);
String[] paths = driver.getPaths();
createFactory().contextOrder().findOrganizationById(1).with(paths).fire(new Receiver<OrganizationProxy>()
{
@Override
public void onSuccess(OrganizationProxy response)
{
if (response == null)
{
organization = requestContext.create(OrganizationProxy.class);
organization.setContactPerson(requestContext.create(PersonProxy.class));
} else
organization = requestContext.edit(response);
driver.edit(organization, requestContext);
}
@Override
public void onFailure(ServerFailure error)
{
createConfirmationDialogBox(error.getMessage()).center();
}
});
}
private static AdminRequestFactory createFactory()
{
AdminRequestFactory factory = GWT.create(AdminRequestFactory.class);
factory.initialize(new SimpleEventBus());
return factory;
}
@UiHandler("save")
void buttonClick(ClickEvent e)
{
e.stopPropagation();
save.setEnabled(false);
try
{
AdminRequestFactory.OrderRequestContext ctx = (AdminRequestFactory.OrderRequestContext) driver.flush();
if (!driver.hasErrors())
{
// Link to each other
PersonProxy contactPerson = organization.getContactPerson();
contactPerson.setOrganization(organization);
String[] paths = driver.getPaths();
ctx.saveOrganization(organization).with(paths).fire(new Receiver<Void>()
{
@Override
public void onSuccess(Void arg0)
{
createConfirmationDialogBox("Saved!").center();
}
@Override
public void onFailure(ServerFailure error)
{
createConfirmationDialogBox(error.getMessage()).center();
}
});
}
} finally
{
save.setEnabled(true);
}
}
}
with() is only used for retrieval of information, so your with() use with a void return type is useless (but harmless).
Whether a full graph is persisted is entirely up to your server-side code, which is intimately bound to your persistence API (JPA, JDO, etc.).
First, check that the Organization object you receive in your save() method on the server side is correctly populated. If it's not the case, check your Locators (and/or static findXxx methods); otherwise, check your save() method's code.
Judging from the code above, I can't see a reason why it wouldn't work.
It took me some time to realize that the problem was the composite id of the Person entity.
Below is the code snippet of PojoLocator that is used by my proxy entities.
public class PojoLocator extends Locator<DatastoreObject, Long>
{
@Override
public DatastoreObject find(Class<? extends DatastoreObject> clazz, Long id)
{
}
@Override
public Long getId(DatastoreObject domainObject)
{
}
}
In order to fetch a child entity from the Datastore you need the id of its parent. To achieve that, I switched the "ID class" of the Locator<> to String, which represents the textual form of Objectify's Key<> class.
Here is how it looks now:
public class PojoLocator extends Locator<DatastoreObject, String>
{
@Override
public DatastoreObject find(Class<? extends DatastoreObject> clazz, String id)
{
Key<DatastoreObject> key = Key.create(id);
return ofy.load(key);
}
@Override
public String getId(DatastoreObject domainObject)
{
if (domainObject.getId() != null)
{
Key<DatastoreObject> key = ofy.fact().getKey(domainObject);
return key.getString();
} else
return null;
}
}
Please note that your implementation may differ slightly because I'm using Objectify 4.