For instance, I have an entity, Product:
public class Product {
...
private String name;
private int count;
private Product associatedProduct;
...
// GETTERS & SETTERS
}
I also have a product finder which allows finding products by filters:
public interface Finder<T> {
Set<T> find(Filter... filters);
}
Now I can execute the following code:
Finder<Product> finder = ...;
// find all products with name 'cucumber'
Set<Product> finder.find(Filter.equals("name", "cucumber"));
I don't like this code because it relies on a 'soft' link to the field name "name", so a misprint or any other mistake cannot produce a compile-time error.
For this reason I have created a code generator which generates static links to properties.
The generated class looks like:
public final class $Product {
private final String context;
// some factory is used for instance creation
$Product() { this.context = ""; }
$Product(String context) { this.context = context; }
public String name() { return context + "name";}
public String count() { return context + "count";}
public String associatedProduct() { return context + "associatedProduct";}
public $Product associatedProductDot() { return new $Product( this.context + "associatedProduct.");}
}
Now I can write the following:
Set<Product> products = finder.find(Filter.equals(Links.PRODUCT.name(), "cucumber"));
//or
Set<Product> products = finder.find(Filter.equals(Links.PRODUCT.associatedProductDot().name(), "cucumber"));
It works like a charm and I am happy.
I know the alternative approach of using proxy objects, but it imposes additional runtime overhead and adds a 'magical' element to the code, so this variant does not suit me.
And finally my question:
Is there a more elegant way to implement this functionality using Java 8?
Java 8 has everything you need:
public static <C,P> Predicate<C> byProperty(Function<C,P> f, P value) {
return component->Objects.equals(f.apply(component), value);
}
public static <C> Set<C> find(Collection<? extends C> c, Predicate<? super C> p) {
return c.stream().filter(p).collect(Collectors.<C>toSet());
}
The standard interface for filtering is called Predicate and the first method above allows you to create arbitrary Predicates for matching a property of a component type C. The second method shows how you can get a Set of matching components out of a Collection using the Stream API. Then you can use it like this:
List<Product> list;
…
Set<Product> set=find(list, byProperty(Product::getName, "foo"));
or
Set<Product> set=find(list, byProperty(Product::getCount, 42));
Note that this is type safe and contains compile-time checked references (your “hard links”) to your properties. The only difference to what you have asked for is that they refer to the getter method rather than to the field names, as a) field references are not supported and b) your fields are private anyway.
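To see the compile-time checking in action: a misspelled method reference simply does not compile (the typo below is deliberate and purely illustrative):
Set<Product> fine = find(list, byProperty(Product::getName, "foo"));     // compiles
// Set<Product> broken = find(list, byProperty(Product::getNmae, "foo")); // compile-time error: no such method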
Note that you can augment these methods by another factory allowing to provide a value-predicate rather than a constant:
public static <C,P> Predicate<C> matchProp(
Function<C,P> f, Predicate<? super P> value) {
return component->value.test(f.apply(component));
}
This allows use cases like:
Set<Product> set=find(list, matchProp(Product::getCount, count -> count>100));
or
Set<Product> set=find(list, matchProp(Product::getName, String::isEmpty));
The fastest thing is to provide your own implementation of the Filter interface. Since I don't know your Filter interface, I have to make an assumption about what it looks like. Here is my assumption:
public interface Filter<T> {
boolean matches(T t);
}
By the way, I think the interface Finder should instead look like this:
public interface Finder<T> {
Set<T> find(Filter<? super T>... filters);
}
So, you could have a class like this:
public final class ProductFilters {
private ProductFilters() { /* Utility class */ }
public static Filter<Product> byName(final String name) {
return new Filter<Product>() {
@Override
public boolean matches(Product t) {
return name.equals(t.getName());
}
};
}
}
You could even put this inside class Product, which can make it a little bit nicer:
public class Product {
private String name;
public static final class Filters {
private Filters() { /* Utility Class */ }
public static Filter<Product> byName(final String name) {
return new Filter<Product>() {
public boolean matches(final Product t) {
return name.equals(t.name);
}
};
}
}
}
And yes, Java 8 makes this stuff nicer, the explicit anonymous class can syntactically be replaced by a lambda, like this:
public class Product {
private String name;
public static final class Filters {
private Filters() { /* Utility Class */ }
public static Filter<Product> byName(final String name) {
return t -> name.equals(t.name);
}
}
}
Your code that uses the filters could now look like this:
Set<Product> cucumbers = finder.find(Product.Filters.byName("cucumber"));
An interface like Filter<T> is already present in Java 8, in the package java.util.function. Its name there is Predicate<T>, and the essential part looks like this:
public interface Predicate<T> {
boolean test(T t);
}
If the products that are to be filtered can be made available as Stream either directly, or via a Collection, you can use the new java.util.stream API for filtering. For the example I assume that the products to be filtered are in a Set, too. The code that filters products could look like this:
Set<Product> potentialCucumbers = ...;
// Inline lambda:
Set<Product> cucumbers = potentialCucumbers.stream().filter(p -> "cucumber".equals(p.getName())).collect(Collectors.toSet());
// Stored lambda as above:
Set<Product> cucumbers = potentialCucumbers.stream().filter(Product.Filters.byName("cucumber")).collect(Collectors.toSet());
I really like static imports for that stuff as they can significantly reduce line length. With static imports it looks like this:
Set<Product> potentialCucumbers = ...;
// Inline lambda:
Set<Product> cucumbers = potentialCucumbers.stream().filter(p -> "cucumber".equals(p.getName())).collect(toSet());
// Stored lambda as above:
Set<Product> cucumbers = potentialCucumbers.stream().filter(byName("cucumber")).collect(toSet());
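For reference, the static imports assumed in the last snippet would look roughly like this (the package name is a placeholder, not from the original post):
import static java.util.stream.Collectors.toSet;
import static com.example.Product.Filters.byName;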
My suggestion would be to use predicates rather than your Filter classes. They make for cleaner code. I would also suggest making commonly used properties like "name" or "owner" into interfaces that provide predicates for searchability. For instance, for the "name" and "owner" properties you might have two interfaces called "Named" and "Owned":
public interface Named {
public String getName();
public void setName(String name);
static <T extends Named> Predicate<T> nameEquals(Class<T> clazz, String s){
return ((p) -> {
if (s == null){
return p.getName() == null;
}
return s.equals(p.getName());
});
}
}
public interface Owned {
public String getOwner();
public void setOwner(String owner);
public static <T extends Owned> Predicate<T> ownerEquals(Class<T> clazz, String s){
return ((p) -> {
if (s == null){
return p.getOwner() == null;
}
return s.equals(p.getOwner());
});
}
}
Then your Product class implements these interfaces, along with a couple simple convenience methods for calling the interface static methods:
public class Product implements Named, Owned{
private String name;
private String owner;
public String getOwner() {
return owner;
}
public String getName() {
return name;
}
public void setOwner(String owner){
this.owner = owner;
}
public void setName(String name){
this.name = name;
}
public static Predicate<Product> nameEquals(String s){
return Named.nameEquals(Product.class, s);
}
public static Predicate<Product> ownerEquals(String s){
return Owned.ownerEquals(Product.class, s);
}
}
And voila, your Product is searchable. Then your find() method's signature changes to take a predicate:
public interface Finder<T> {
Set<T> find(Predicate<? super T> p);
}
One of the wonderful things about predicates is how easy they are to combine and compound with one another. For example, let's say we want to find() any products named "cucumber" who aren't owned by "john", or any products owned by "john" with any other names. The call to find() is pretty clean and understandable:
finder.find(
Product.nameEquals("cucumber")
.and(Product.ownerEquals("john").negate())
.or(
Product.ownerEquals("john")
.and(Product.nameEquals("cucumber").negate())
)
);
It should be pretty clear what this block of code is doing; the indentation is meant to make it clearer how the predicates combine. We can combine different predicates to our hearts' content.
Related
Firstly, apologies for the not-so-great title; I am new to Java and wasn't sure how to title this.
I have an interface "TestInterface":
public interface TestInterface {
String getForename();
void setForename(String forename);
String getSurname();
void setSurname(String surname);
}
"TestImpl" implements "TestInterface":
public class TestImpl implements TestInterface{
private String forename;
private String surname;
@Override
public String getForename() {
return forename;
}
public void setForename(String forename) {
this.forename = forename;
}
@Override
public String getSurname() {
return surname;
}
public void setSurname(String surname) {
this.surname = surname;
}
}
Then I have a class called "ExtendTest" which extends "TestImpl":
public class ExtendTest extends TestImpl{
private String firstLineAddress;
public String getFirstLineAddress() {
return firstLineAddress;
}
public void setFirstLineAddress(String firstLineAddress) {
this.firstLineAddress = firstLineAddress;
}
}
I then have this "Entity" class:
import java.util.List;
public class Entity {
private List<TestInterface> testInterfaces;
private List<ExtendTest> extendTests;
public List<TestInterface> getTestInterfaces() {
return testInterfaces;
}
public void setTestInterfaces(List<TestInterface> testInterfaces) {
this.testInterfaces = testInterfaces;
}
public List<ExtendTest> getExtendTests() {
return extendTests;
}
public void setExtendTests(List<ExtendTest> extendTests) {
this.extendTests = extendTests;
}
}
and finally this "DoStuff" class, where the doStuff method accepts a parameter of type List<TestInterface>:
import java.util.List;
public class DoStuff {
public void doStuff(List<TestInterface> testData) {
}
}
I try to test this like so:
public class Main {
public static void main(String[] args) {
System.out.println("Hello, World!");
DoStuff doStuff = new DoStuff();
Entity entity = new Entity();
// Works
doStuff.doStuff(entity.getTestInterfaces());
// Does not work
doStuff.doStuff(entity.getExtendTests());
}
}
However, where the comment says "Does not work" there is an error:
Required type:
List<TestInterface>
Provided:
List<ExtendTest>
My question is: how do I make it so that I can pass it in? My understanding was that because they all implement TestInterface it would work, but I think I am wrong about this.
Thanks for any help and learnings here :)
You've run afoul of PECS. I recommend reading the linked answer for a more detailed explanation, but here are the bits specific to your use case.
When you have a generic type (List, in your case), if you only read from it, you should write List<? extends MyInterface>. If you only write to it, you should write List<? super MyInterface>. If you do both, then you want List<MyInterface>. Why do we do this? Well, look at your code.
public void doStuff(List<TestInterface> testData) { ... }
This function takes a List<TestInterface>. The List interface has a ton of capability. You can add and remove things to it in addition to just reading from it. And doStuff expects a list of TestInterface. So it's entirely fair game for the implementation of doStuff to do
testData.add(new ClassIJustMadeUp());
assuming ClassIJustMadeUp implements TestInterface. So we definitely can't pass this function a List<ExtendTest>, since that list type can't contain ClassIJustMadeUp.
However, if your function does only read from the list and isn't planning to add anything to it, you can write the signature as
public void doStuff(List<? extends TestInterface> testData) { ... }
and now you can pass a List of any type which extends TestInterface. It's fine to read from this list, since any type which extends TestInterface clearly can be upcast safely to TestInterface. But if we try to add a list element, that's a compiler error since the list doesn't necessarily support that particular type.
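A minimal sketch of how that plays out with the classes from the question:
public void doStuff(List<? extends TestInterface> testData) {
    for (TestInterface t : testData) {      // reading as TestInterface is always safe
        System.out.println(t.getForename());
    }
    // testData.add(new TestImpl());        // compile-time error: the actual element type is unknown
}
With this signature, both doStuff.doStuff(entity.getTestInterfaces()) and doStuff.doStuff(entity.getExtendTests()) compile.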
Let's say I have a class to model an item in a game, like so:
public class Item {
private final EnumItem type;
public Item(EnumItem type) {
this.type = type;
}
public Item(String name) {
this.type = EnumItem.fromName(name);
}
}
public enum EnumItem {
MACHINE_GUN("machine_gun"),
SWORD("sword"),
BAT("bat"),
DEFAULT("default");
private final String name;
EnumItem(String name) {
this.name = name;
}
public String getName() { return name; }
public static EnumItem fromName(String name) {
for(EnumItem i: EnumItem.values()) {
if(i.name.equals(name)) {
return i;
}
}
return EnumItem.DEFAULT;
}
}
Assume that .equals() and .hashCode() of Item are overridden correctly to compare the internal Enum.
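"Overridden correctly" is assumed to mean something along these lines (a minimal sketch, not part of the original post):
@Override
public boolean equals(Object o) {
    if (this == o) return true;
    if (!(o instanceof Item)) return false;
    return this.type == ((Item) o).type;   // enum constants can safely be compared with ==
}
@Override
public int hashCode() {
    return type.hashCode();
}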
Now I want a way to distinguish these items with a getter in Item: should I return an Enum or the String name? Is it good practice to return an Enum in general? Or is there a better way to distinguish these Items? Because returning the enum kind of looks like exposing the rep to me and I don't want my colleagues to use EnumItem directly to compare Items.
The approaches I thought of are the following:
String getName() to do something like item1.getName().equals("machine_gun");
EnumItem getEnum() to do item1.getEnum().equals(EnumItem.MACHINE_GUN);
item1.equals(new Item("machine_gun"));
static Item name(String name) { return new Item(name); } to do item1.equals(Item.name("machine_gun"));
I don't know what I should do; I'd appreciate some insight from experienced programmers.
I know they look like they would from context, but in my use case these items have no special functionality that would justify extending from the base Item class.
Is this good practice? Sure, you're using aggregation since Item doesn't depend on EnumItem, which is fine. That being said, could it be done better? Sure. Is the alternative I provide the only solution? No.
Alternative
If you want this to be extensible, consider using an interface to represent an item, and then let the enum implement this interface to provide some standard types. Alternatively, you could use composition or aggregation to define a type inside EnumItem that implements the Item interface, to ensure that equals/hashCode for the Item are always overridden and adhere to some contract.
interface Item {
String key();
}
enum EnumItem implements Item {
FOO_BAR("FooBar"); // constants go here; FOO_BAR is used in the example below
private final String key;
EnumItem(String key) {
this.key = key;
}
@Override
public String key() {
return key;
}
}
class AbstractItem implements Item {
// constructor taking the key, override key()
}
Item item = EnumItem.FOO_BAR;
Item item2 = new AbstractItem("FooBar");
Item item3 = () -> "FooBar";
I have a class with some 20+ fields of the same type that are populated during different stages of the object lifecycle.
One of the class methods should return the field value based on the field name.
So far I have something like this:
public String getFieldValue(String fieldName){
switch (fieldName.toLowerCase()) {
case "id": return getId();
case "name": return getName();
.....
The problem with this is high cyclomatic complexity.
What would be the easiest way to tackle this?
Edit: Thanks to @Filippo Possenti for his comment
Instead of a switch, you can use a Map.
Here is an example.
static interface C {
String getA();
String getB();
String getC();
}
@FunctionalInterface
static interface FieldGetter {
String get(C c);
}
static Map<String, FieldGetter> fields = Map.of(
"a", C::getA,
"b", C::getB,
"c", C::getC
);
static String getField(C object, String fieldNameToRetrieve) {
var getter = fields.get(fieldNameToRetrieve);
if(getter == null) {
throw new IllegalArgumentException("unknown field");
}
return getter.get(object);
}
Why don't you use reflection or an existing library for this? (Or why do you even have this kind of method?)
In theory you could reduce the getFieldValue() method complexity by:
storing the getter method reference as a Supplier<?> in a Map<String, Supplier<?>>
using reflection to look up fields (a rough sketch is given at the end of this answer)
using 3rd party library that supports querying the bean by property name e.g. commons-beanutils.
Each of these approaches will, however, add complexity of a different kind to getFieldValue() and potentially reduce performance; both are worse problems than high cyclomatic complexity.
It feels like you should review why you need the getFieldValue() method in the first place; maybe it should be a Map<String, ?>?
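For completeness, here is a rough sketch of the reflection variant mentioned above; it assumes the usual JavaBean getter naming convention and is illustrative only:
public String getFieldValue(String fieldName) {
    try {
        // e.g. "name" -> getName(), "id" -> getId()
        String getter = "get" + fieldName.substring(0, 1).toUpperCase() + fieldName.substring(1);
        return (String) getClass().getMethod(getter).invoke(this);
    } catch (ReflectiveOperationException e) {
        throw new IllegalArgumentException("unknown field: " + fieldName, e);
    }
}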
Assuming that the fieldName possible values match the getters on the bean, you can use Apache's BeanUtils:
https://commons.apache.org/proper/commons-beanutils/apidocs/org/apache/commons/beanutils/PropertyUtils.html#getSimpleProperty-java.lang.Object-java.lang.String-
Basically, you could do something like this:
public String getFieldValue(String fieldName) throws Exception {
// getSimpleProperty needs the bean instance and may throw reflective exceptions
return (String) PropertyUtils.getSimpleProperty(this, fieldName.toLowerCase());
}
This is more about improving code readability than improving cyclomatic complexity, so if pure performance is what you're after, this may not be your solution.
If pure performance is what you're after, you could try and leverage lambdas and a Map.
import java.util.Map;
import java.util.HashMap;
import java.util.function.Function;
public class HelloWorld{
public static class MyClass {
private static Map<String, Function<MyClass, Object>> descriptor;
static {
descriptor = new HashMap<>();
descriptor.put("id", MyClass::getId);
descriptor.put("name", MyClass::getName);
}
private String id;
private String name;
public String getId() {
return id;
}
public String getName() {
return name;
}
public void setId(String value) {
id = value;
}
public void setName(String value) {
name = value;
}
public Object getFieldValue(String fieldName) {
Function<MyClass, Object> fn = descriptor.get(fieldName);
return fn.apply(this);
}
}
public static void main(String []args){
MyClass mc = new MyClass();
mc.setId("hello");
mc.setName("world");
System.out.println(mc.getFieldValue("id") + " " + mc.getFieldValue("name"));
}
}
Note that in the above example the cyclomatic complexity is somewhat still there, but it has moved into the class's static initialiser. This means you'll suffer a modest penalty during application startup but enjoy higher performance in subsequent calls to getFieldValue.
Also, if performance is what you're after, you may want to eliminate the need for toLowerCase... which in my example I removed.
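If callers may still pass arbitrary case, one option (my assumption, not part of the example above) is to back the descriptor with a case-insensitive TreeMap instead of lower-casing on every lookup:
// requires java.util.TreeMap; drop-in replacement for the HashMap in the static initialiser
static {
    descriptor = new TreeMap<>(String.CASE_INSENSITIVE_ORDER);
    descriptor.put("id", MyClass::getId);
    descriptor.put("name", MyClass::getName);
}
With this, getFieldValue("ID") and getFieldValue("id") resolve to the same getter.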
Instead of the switch or using a Map, you can use an enum.
enum FieldExtractor implements Function<YourClass, String> {
ID(YourClass::getId),
NAME(YourClass::getName); // and so on
private final Function<YourClass, String> delegate;
FieldExtractor(Function<YourClass, String> delegate) {
this.delegate = delegate;
}
@Override public String apply(YourClass extractFrom) {
return delegate.apply(extractFrom);
}
static FieldExtractor fromString(String name) {
return Stream.of(FieldExtractor.values())
.filter(fe -> fe.name().equalsIgnoreCase(name))
.findFirst()
.orElseThrow(IllegalArgumentException::new);
}
}
Now you can use
public String getFieldValue(String fieldName) {
return FieldExtractor.fromString(fieldName).apply(this);
}
in your client code.
I'm writing a library, which has a predefined set of values for an enum.
Let say, my enum looks as below.
public enum EnumClass {
FIRST("first"),
SECOND("second"),
THIRD("third");
private String httpMethodType;
EnumClass(String httpMethodType) {
this.httpMethodType = httpMethodType;
}
}
Now the client who is using this library may need to add a few more values. Let's say the client needs to add CUSTOM_FIRST and CUSTOM_SECOND. This does not overwrite any existing values, but results in the enum having 5 values.
After this, I should be able to use something like <? extends EnumClass> to have 5 constant possibilities.
What would be the best approach to achieve this?
You cannot have an enum extend another enum, and you cannot "add" values to an existing enum through inheritance.
However, enums can implement interfaces.
What I would do is have the original enum implement a marker interface (i.e. no method declarations), then your client could create their own enum implementing the same interface.
Then your enum values would be referred to by their common interface.
In order to strengthen the requirements, you could have your interface declare relevant methods, e.g. in your case, something along the lines of public String getHTTPMethodType();.
That would force implementing enums to provide an implementation for that method.
This setting coupled with adequate API documentation should help adding functionality in a relatively controlled way.
Self-contained example (don't mind the lazy names here)
package test;
import java.util.ArrayList;
import java.util.List;
public class Main {
public static void main(String[] args) {
List<HTTPMethodConvertible> blah = new ArrayList<>();
blah.add(LibraryEnum.FIRST);
blah.add(ClientEnum.BLABLABLA);
for (HTTPMethodConvertible element: blah) {
System.out.println(element.getHTTPMethodType());
}
}
static interface HTTPMethodConvertible {
public String getHTTPMethodType();
}
static enum LibraryEnum implements HTTPMethodConvertible {
FIRST("first"),
SECOND("second"),
THIRD("third");
String httpMethodType;
LibraryEnum(String s) {
httpMethodType = s;
}
public String getHTTPMethodType() {
return httpMethodType;
}
}
static enum ClientEnum implements HTTPMethodConvertible {
FOO("GET"),BAR("PUT"),BLAH("OPTIONS"),MEH("DELETE"),BLABLABLA("POST");
String httpMethodType;
ClientEnum(String s){
httpMethodType = s;
}
public String getHTTPMethodType() {
return httpMethodType;
}
}
}
Output
first
POST
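To show why the common interface matters: the library's own API would be written against HTTPMethodConvertible rather than against LibraryEnum, so client constants are accepted as well. The register method below is hypothetical:
// library-side method (hypothetical)
public void register(HTTPMethodConvertible method) {
    System.out.println("Registering " + method.getHTTPMethodType());
}
// register(LibraryEnum.FIRST);   // works
// register(ClientEnum.BLAH);     // works too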
Enums are not extensible. To solve your problem simply
turn the enum into a class
create constants for the predefined types
if you want a replacement for Enum.valueOf: track all instances of the class in a static map
For example:
public class MyType {
private static final HashMap<String,MyType> map = new HashMap<>();
private String name;
private String httpMethodType;
// replacement for Enum.valueOf
public static MyType valueOf(String name) {
return map.get(name);
}
public MyType(String name, String httpMethodType) {
this.name = name;
this.httpMethodType = httpMethodType;
map.put(name, this);
}
// accessors
public String name() { return name; }
public String httpMethodType() { return httpMethodType; }
// predefined constants
public static final MyType FIRST = new MyType("FIRST", "first");
public static final MyType SECOND = new MyType("SECOND", "second");
...
}
Think about an enum as a final class with static final instances of itself. Of course you cannot extend a final class, but you can use a non-final class with static final instances in your library. You can see an example of this kind of definition in the JDK: the class java.util.logging.Level can be extended with a class containing an additional set of logging levels.
If you accept this way of implementation, your library code example can be like:
public class EnumClass {
public static final EnumClass FIRST = new EnumClass("first");
public static final EnumClass SECOND = new EnumClass("second");
public static final EnumClass THIRD = new EnumClass("third");
private String httpMethodType;
protected EnumClass(String name){
this.httpMethodType = name;
}
}
The client application can extend the list of static members with inheritance:
public final class ClientEnum extends EnumClass{
public static final ClientEnum CUSTOM_FIRST = new ClientEnum("custom_first");
public static final ClientEnum CUSTOM_SECOND = new ClientEnum("custom_second");
private ClientEnum(String name){
super(name);
}
}
I think this solution is close to what you asked for, because all static instances are visible from the client class, and all of them will satisfy your generic wildcard.
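A small sketch of what "satisfy your generic wildcard" means here (the handle method is hypothetical):
static void handle(List<? extends EnumClass> values) {
    // accepts List<EnumClass> and List<ClientEnum> alike
}
// handle(Arrays.asList(EnumClass.FIRST, EnumClass.SECOND));
// handle(Arrays.asList(ClientEnum.CUSTOM_FIRST, ClientEnum.CUSTOM_SECOND));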
We fixed the enum inheritance issue this way; hope it helps.
Our app has a few classes and each has a few child views (nested views). In order to be able to navigate between child views and save the current child view, we saved them as an enum inside each class.
But we had to copy and paste some common functionality, like next, previous, etc., inside each enum.
To avoid that we needed a base enum, so we used an interface as our base enum:
public interface IBaseEnum {
IBaseEnum[] getList();
int getIndex();
class Utils{
public IBaseEnum next(IBaseEnum enumItem, boolean isCycling){
int index = enumItem.getIndex();
IBaseEnum[] list = enumItem.getList();
if (index + 1 < list.length) {
return list[index + 1];
} else if(isCycling)
return list[0];
else
return null;
}
public IBaseEnum previous(IBaseEnum enumItem, boolean isCycling) {
int index = enumItem.getIndex();
IBaseEnum[] list = enumItem.getList();
IBaseEnum previous;
if (index - 1 >= 0) {
previous = list[index - 1];
}
else {
if (isCycling)
previous = list[list.length - 1];
else
previous = null;
}
return previous;
}
}
}
and this is how we used it
enum ColorEnum implements IBaseEnum {
RED,
YELLOW,
BLUE;
@Override
public IBaseEnum[] getList() {
return values();
}
@Override
public int getIndex() {
return ordinal();
}
public ColorEnum getNext(){
return (ColorEnum) new Utils().next(this,false);
}
public ColorEnum getPrevious(){
return (ColorEnum) new Utils().previous(this,false);
}
}
You could add getNext/getPrevious to the interface too.
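For example, with Java 8 the two helpers could be declared as default methods on IBaseEnum itself, at the cost of returning IBaseEnum rather than the concrete enum type (a sketch):
default IBaseEnum getNext(boolean isCycling) {
    return new Utils().next(this, isCycling);
}
default IBaseEnum getPrevious(boolean isCycling) {
    return new Utils().previous(this, isCycling);
}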
@wero's answer is very good but has a potential problem:
If the predefined constants are declared before the map, new MyType("FIRST", "first") will run before map = new HashMap<>(); in other words, the map will be null when map.put() is called. Unfortunately, the error that occurs will be a NoClassDefFoundError, which doesn't help to find the problem. Check this:
public class Subject {
// predefined constants
public static final Subject FIRST;
public static final Subject SECOND;
private static final HashMap<String, Subject> map;
static {
map = new HashMap<>();
FIRST = new Subject("FIRST");
SECOND = new Subject("SECOND");
}
private final String name;
public Subject(String name) {
this.name = name;
map.put(name, this);
}
// replacement for Enum.valueOf
public static Subject valueOf(String name) {
return map.get(name);
}
// accessors
public String name() {
return name;
}
}
I am using Tapestry 5.3.6 for a web application and I want the user to edit an instance of a Java class (a "bean", or POJO) using a web form (which immediately suggests the use of beaneditform) - however the Java class to be edited has a fairly complex structure. I am looking for the simplest way of doing this in Tapestry 5.
Firstly, let's define some utility classes, e.g.:
public class ModelObject {
private URI uri;
private boolean modified;
// the usual constructors, getters and setters ...
}
public class Literal<T> extends ModelObject {
private Class<?> valueClass;
private T value;
public Literal(Class<?> valueClass) {
this.valueClass = valueClass;
}
public Literal(Class<?> valueClass, T value) {
this.valueClass = valueClass;
this.value = value;
}
// the usual getters and setters ...
}
public class Link<T extends ModelObject> extends ModelObject {
private Class<?> targetClass;
private T target;
public Link(Class<?> targetClass) {
this.targetClass = targetClass;
}
public Link(Class<?> targetClass, T target) {
this.targetClass = targetClass;
this.target = target;
}
// the usual getters and setters ...
}
Now you can create some fairly complex data structures, for example:
public class HumanBeing extends ModelObject {
private Literal<String> name;
// ... other stuff
public HumanBeing() {
name = new Literal<String>(String.class);
}
// the usual getters and setters ...
}
public class Project extends ModelObject {
private Literal<String> projectName;
private Literal<Date> startDate;
private Literal<Date> endDate;
private Literal<Integer> someCounter;
private Link<HumanBeing> projectLeader;
private Link<HumanBeing> projectManager;
// ... other stuff, including lists of things, that may be Literals or
// Links ... e.g. (ModelObjectList is an enhanced ArrayList that remembers
// the type(s) of the objects it contains - to get around type erasure ...
private ModelObjectList<Link<HumanBeing>> projectMembers;
private ModelObjectList<Link<Project>> relatedProjects;
private ModelObjectList<Literal<String>> projectAliases;
// the usual constructors, getters and setters for all of the above ...
public Project() {
projectName = new Literal<String>(String.class);
startDate = new Literal<Date>(Date.class);
endDate = new Literal<Date>(Date.class);
someCounter = new Literal<Integer>(Integer.class);
projectLeader = new Link<HumanBeing>(HumanBeing.class);
projectManager = new Link<HumanBeing>(HumanBeing.class);
projectMembers = new ModelObjectList<Link<HumanBeing>>(Link.class, HumanBeing.class);
// ... more ...
}
}
If you point beaneditform at an instance of Project.class, you will not get very far before you have to supply a lot of custom coercers, translators, valueencoders, etc - and then you still run into the problem that you can't use generics when "contributing" said coercers, translators, valueencoders, etc.
I then started writing my own components to get around these problems (e.g. ModelObjectDisplay and ModelObjectEdit) but this would require me to understand a lot more of the guts of Tapestry than I have time to learn ... it feels like I might be able to do what I want using the standard components and liberal use of "delegate" etc. Can anyone see a simple path for me to take with this?
Thanks for reading this far.
PS: if you are wondering why I have done things like this, it is because the model represents linked data from an RDF graph database (aka triple-store) - I need to remember the URI of every bit of data and how it relates (links) to other bits of data (you are welcome to suggest better ways of doing this too :-)
EDIT:
@uklance suggested using display and edit blocks - here is what I had already tried:
Firstly, I had the following in AppPropertyDisplayBlocks.tml ...
<t:block id="literal">
<t:delegate to="literalType" t:value="literalValue" />
</t:block>
<t:block id="link">
<t:delegate to="linkType" t:value="linkValue" />
</t:block>
and in AppPropertyDisplayBlocks.java ...
public Block getLiteralType() {
Literal<?> literal = (Literal<?>) context.getPropertyValue();
Class<?> valueClass = literal.getValueClass();
if (!AppModule.modelTypes.containsKey(valueClass))
return null;
String blockId = AppModule.modelTypes.get(valueClass);
return resources.getBlock(blockId);
}
public Object getLiteralValue() {
Literal<?> literal = (Literal<?>) context.getPropertyValue();
return literal.getValue();
}
public Block getLinkType() {
Link<?> link = (Link<?>) context.getPropertyValue();
Class<?> targetClass = link.getTargetClass();
if (!AppModule.modelTypes.containsKey(targetClass))
return null;
String blockId = AppModule.modelTypes.get(targetClass);
return resources.getBlock(blockId);
}
public Object getLinkValue() {
Link<?> link = (Link<?>) context.getPropertyValue();
return link.getTarget();
}
AppModule.modelTypes is a map from java class to a String to be used by Tapestry e.g. Link.class -> "link" and Literal.class -> "literal" ... in AppModule I had the following code ...
public static void contributeDefaultDataTypeAnalyzer(
MappedConfiguration<Class<?>, String> configuration) {
for (Class<?> type : modelTypes.keySet()) {
String name = modelTypes.get(type);
configuration.add(type, name);
}
}
public static void contributeBeanBlockSource(
Configuration<BeanBlockContribution> configuration) {
// using HashSet removes duplicates ...
for (String name : new HashSet<String>(modelTypes.values())) {
configuration.add(new DisplayBlockContribution(name,
"blocks/AppPropertyDisplayBlocks", name));
configuration.add(new EditBlockContribution(name,
"blocks/AppPropertyEditBlocks", name));
}
}
I had similar code for the edit blocks ... however none of this seemed to work. I think this is because the original object was passed to the "delegate" rather than the de-referenced object, which was either the value stored in the literal or the object the link pointed to (hmm... should be [Ll]inkTarget in the above, not [Ll]inkValue). I also kept running into errors where Tapestry couldn't find a suitable "translator", "valueencoder" or "coercer" ... I am under some time pressure, so it is difficult to follow these twisty passages through in order to get out of the maze :-)
I would suggest building a thin wrapper around the objects you would like to edit through the BeanEditForm and passing those into it. Something like:
public class TapestryProject {
private Project project;
public TapestryProject(Project proj){
this.project = proj;
}
public String getName(){
return this.project.getProjectName().getValue();
}
public void setName(String name){
this.project.getProjectName().setValue(name);
}
etc...
}
This way Tapestry will deal with all the types it knows about, leaving you free of having to create your own coercions (which is quite simple in itself, by the way).
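A rough sketch of how the wrapper might be wired into a page; the page class, loadProject method and template line are hypothetical, not from the original post:
public class EditProject {
    @Property
    private TapestryProject tapestryProject;
    // the "prepare" event is fired by the form before it reads/writes the bean
    void onPrepare() {
        tapestryProject = new TapestryProject(loadProject()); // loadProject() is a placeholder
    }
}
// template: <t:beaneditform object="tapestryProject"/>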
You can contribute blocks to display and edit your "link" and "literal" datatypes.
The beaneditform, beaneditor and beandisplay are backed by the BeanBlockSource service. BeanBlockSource is responsible for providing display and edit blocks for various datatypes.
If you download the tapestry source code and have a look at the following files:
tapestry-core\src\main\java\org\apache\tapestry5\corelib\pages\PropertyEditBlocks.java
tapestry-core\src\main\resources\org\apache\tapestry5\corelib\pages\PropertyEditBlocks.tml
tapestry-core\src\main\java\org\apache\tapestry5\services\TapestryModule.java
You will see how tapestry contributes EditBlockContribution and DisplayBlockContribution to provide default blocks (eg for a "date" datatype).
If you contribute to BeanBlockSource, you could provide display and edit blocks for your custom datatypes. This will require you to reference blocks by id in a page. The page can be hidden from your users by annotating it with @WhitelistAccessOnly.
http://tapestry.apache.org/current/apidocs/org/apache/tapestry5/services/BeanBlockSource.html
http://tapestry.apache.org/current/apidocs/org/apache/tapestry5/services/DisplayBlockContribution.html
http://tapestry.apache.org/current/apidocs/org/apache/tapestry5/services/EditBlockContribution.html
http://tapestry.apache.org/current/apidocs/org/apache/tapestry5/annotations/WhitelistAccessOnly.html
Here's an example of using an interface and a proxy to hide the implementation details from your model. Note how the proxy takes care of updating the modified flag and is able to map URI's from the Literal array to properties in the HumanBeing interface.
package com.github.uklance.triplestore;
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;
import org.junit.Test;
public class TripleStoreOrmTest {
public static class Literal<T> {
public String uri;
public boolean modified;
public Class<T> type;
public T value;
public Literal(String uri, Class<T> type, T value) {
super();
this.uri = uri;
this.type = type;
this.value = value;
}
@Override
public String toString() {
return "Literal [uri=" + uri + ", type=" + type + ", value=" + value + ", modified=" + modified + "]";
}
}
public interface HumanBeing {
public String getName();
public void setName(String name);
public int getAge();
public void setAge(int age);
}
public interface TripleStoreProxy {
public Map<String, Literal<?>> getLiteralMap();
}
@Test
public void testMockTripleStore() {
Literal<?>[] literals = {
new Literal<String>("http://humanBeing/1/Name", String.class, "Henry"),
new Literal<Integer>("http://humanBeing/1/Age", Integer.class, 21)
};
System.out.println("Before " + Arrays.asList(literals));
HumanBeing humanBeingProxy = createProxy(literals, HumanBeing.class);
System.out.println("Before Name: " + humanBeingProxy.getName());
System.out.println("Before Age: " + humanBeingProxy.getAge());
humanBeingProxy.setName("Adam");
System.out.println("After Name: " + humanBeingProxy.getName());
System.out.println("After Age: " + humanBeingProxy.getAge());
Map<String, Literal<?>> literalMap = ((TripleStoreProxy) humanBeingProxy).getLiteralMap();
System.out.println("After " + literalMap);
}
protected <T> T createProxy(Literal<?>[] literals, Class<T> type) {
Class<?>[] proxyInterfaces = { type, TripleStoreProxy.class };
final Map<String, Literal> literalMap = new HashMap<String, Literal>();
for (Literal<?> literal : literals) {
String name = literal.uri.substring(literal.uri.lastIndexOf("/") + 1);
literalMap.put(name, literal);
}
InvocationHandler handler = new InvocationHandler() {
public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
if (method.getDeclaringClass().equals(TripleStoreProxy.class)) {
return literalMap;
}
if (method.getName().startsWith("get")) {
String name = method.getName().substring(3);
return literalMap.get(name).value;
} else if (method.getName().startsWith("set")) {
String name = method.getName().substring(3);
Literal<Object> literal = literalMap.get(name);
literal.value = args[0];
literal.modified = true;
}
return null;
}
};
return type.cast(Proxy.newProxyInstance(getClass().getClassLoader(), proxyInterfaces, handler));
}
}