Enum's ability to implement part of the business logic - java

I'm trying to refactor some code. Currently it looks like this:
if ("objects".equals(type)) {
Object oldJson = oldData.get("content");
Object newJson = newData.get("content");
} else if ("objects.appeals".equals(type)) {
Object oldJson = oldData.get("data").get("person");
Object newJson = newData.get("data").get("person");
}
There are many more types; I've shown only two as an example. I'm trying to simplify this with an enum:
public enum HistoryUpdateTypeEnum {
OBJECTS("objects", new Document()),
APPEALS_OBJECTS("appeals.objects", new Document());
HistoryUpdateTypeEnum(String type, Document documentSlice) {
this.type = type;
this.documentSlice = documentSlice;
}
private String type;
private Document documentSlice;
public static HistoryUpdateTypeEnum fromString(String value) {
return Stream.of(values())
.filter(Objects::nonNull)
.filter(v -> v.name().replaceAll("_",".").equalsIgnoreCase(value))
.findAny()
.orElse(null);
}
public Object formSlice(Document data) {
this.documentSlice = data;
return documentSlice.get("content"); // How to make it universal?
}
}
And use:
HistoryUpdateTypeEnum typeEnum = HistoryUpdateTypeEnum.fromString("objects.appeals");
Document oldData = new Document(......).append(..., ...);
Document newData = new Document(......).append(..., ...);
Object oldJson = typeEnum.formSlice(oldData);
Object newJson = typeEnum.formSlice(newData);
I can't figure out how to perform the right action for each type, i.e. documentSlice.get("content") for 'objects' or documentSlice.get("data").get("person") for 'appeals.objects'. Are there any ideas?

One possible approach is an abstract method in your enum class:
public enum HistoryUpdateTypeEnum {
OBJECTS {
@Override
Object getJson(Document data) {
return data.get("objects");
}
},
...
abstract Object getJson(Document data);
}
Then you could use it this way:
HistoryUpdateTypeEnum history = HistoryUpdateTypeEnum.valueOf(type.toUpperCase().replace('.', '_'));
Object oldJson = history.getJson(oldData);
Object newJson = history.getJson(newData);
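Putting the lookup and the per-type logic together, here is a minimal sketch of the whole enum, using the question's formSlice/fromString names. It assumes Document is org.bson.Document (so nested values have to be cast back to Document) and that java.util.stream.Stream is imported; adjust the slice logic per constant as needed:
public enum HistoryUpdateTypeEnum {
    OBJECTS("objects") {
        @Override
        public Object formSlice(Document data) {
            return data.get("content");
        }
    },
    APPEALS_OBJECTS("appeals.objects") {
        @Override
        public Object formSlice(Document data) {
            // assumes the "data" entry is itself a nested Document
            return ((Document) data.get("data")).get("person");
        }
    };

    private final String type;

    HistoryUpdateTypeEnum(String type) {
        this.type = type;
    }

    public abstract Object formSlice(Document data);

    public static HistoryUpdateTypeEnum fromString(String value) {
        return Stream.of(values())
                .filter(v -> v.type.equalsIgnoreCase(value))
                .findAny()
                .orElse(null);
    }
}
Usage then stays close to the original intent:
Object oldJson = HistoryUpdateTypeEnum.fromString("appeals.objects").formSlice(oldData);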

Kafka Connect. How to handle List of custom object, when specifying schema and building SourceRecord value

I have a DTO, CryptoNews, which contains
List<Currencies> currencies
I would like to save the "currencies" field to the SourceRecord when constructing it.
I can't figure out how to:
Declare it in the schema.
Pass it to the Struct object when building the value.
My attempts end in this exception:
Invalid Java object for schema type STRUCT: class com.dto.Currencies
Kafka Connect doesn't provide an explicit example of how to handle the case where an object in a List requires its own Schema.
I also tried to apply a similar approach as in the Kafka test cases, but it doesn't work. https://github.com/apache/kafka/blob/trunk/connect/api/src/test/java/org/apache/kafka/connect/data/StructTest.java#L95-L98
How can I do this?
kafka-connect-api version: 0.10.2.0-cp1
value and key converter: org.apache.kafka.connect.json.JsonConverter
no Avro used
class CryptoNews implements Serializable {
// omitted fields
private List<Currencies> currencies;
}
class Currencies {
private String code;
private String title;
private String slug;
private String url;
}
SchemaConfiguration
public static final Integer FIRST_VERSION = 1;
public static final String CURRENCIES_SCHEMA_NAME = "currencies";
public static final Schema NEWS_SCHEMA = SchemaBuilder.struct().name("News")
.version(FIRST_VERSION)
.field(CURRENCIES_SCHEMA_NAME, CURRENCIES_SCHEMA)
// simple fields omitted for brevity.
.build();
public static final Schema CURRENCIES_SCHEMA = SchemaBuilder.array(
SchemaBuilder.struct()
.field(CODE_FIELD, Schema.OPTIONAL_STRING_SCHEMA)
.field(TITLE_FIELD, Schema.OPTIONAL_STRING_SCHEMA)
.field(SLUG_FIELD, Schema.OPTIONAL_STRING_SCHEMA)
.field(URL_FIELD, Schema.OPTIONAL_STRING_SCHEMA)
.optional()
.build()
)
.optional()
.name(CURRENCIES_SCHEMA_NAME)
.version(FIRST_VERSION)
.build();
SourceTask
return new SourceRecord(
sourcePartition(),
sourceOffset(cryptoNews),
config.getString(TOPIC_CONFIG),
null,
CryptoNewsSchema.NEWS_KEY_SCHEMA,
buildRecordKey(cryptoNews),
CryptoNewsSchema.NEWS_SCHEMA,
buildRecordValue(cryptoNews),
Instant.now().toEpochMilli()
);
public Struct buildRecordValue(CryptoNews cryptoNews){
Struct valueStruct = new Struct(CryptoNewsSchema.NEWS_SCHEMA);
// Produces Invalid Java object for schema type STRUCT: class com.dto.Currencies
List<Currencies> currencies = cryptoNews.getCurrencies();
if (currencies != null) {
valueStruct.put(CurrenciesSchema.CURRENCIES_SCHEMA_NAME, currencies);
}
return valueStruct;
}
UPDATE:
worker.properties
bootstrap.servers=localhost:29092
key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=true
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=true
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=true
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter.schemas.enable=true
rest.port=8086
rest.host.name=127.0.0.1
offset.storage.file.filename=offsets/standalone.offsets
offset.flush.interval.ms=10000
You need to provide a List<Struct>.
Here's a full unit test example.
First, an interface that will help:
public interface ConnectPOJOConverter<T> {
Schema getSchema();
T fromConnectData(Struct s);
Struct toConnectData(T t);
}
class ArrayStructTest {
public static final Schema CURRENCY_ITEM_SCHEMA = SchemaBuilder.struct()
.version(1)
.name(Currency.class.getName())
.doc("A currency item")
.field("code", Schema.OPTIONAL_STRING_SCHEMA)
.field("title", Schema.OPTIONAL_STRING_SCHEMA)
.field("slug", Schema.OPTIONAL_STRING_SCHEMA)
.field("url", Schema.OPTIONAL_STRING_SCHEMA)
.build();
static final ConnectPOJOConverter<Currency> CONVERTER = new CurrencyConverter();
@Test
void myTest() {
// Given
List<Currency> currencies = new ArrayList<>();
// TODO: Get from external source
currencies.add(new Currency("200", "Hello", "/slug", "http://localhost"));
currencies.add(new Currency("200", "World", "/slug", "http://localhost"));
// When: build Connect Struct data
Schema valueSchema = SchemaBuilder.struct()
.name("CryptoNews")
.doc("A record holding a list of currency items")
.version(1)
.field("currencies", SchemaBuilder.array(CURRENCY_ITEM_SCHEMA).required().build())
.build();
final List<Struct> items = currencies.stream()
.map(CONVERTER::toConnectData)
.collect(Collectors.toList());
// In the SourceTask, this is what goes into the SourceRecord along with the valueSchema
Struct value = new Struct(valueSchema);
value.put("currencies", items);
// Then
assertDoesNotThrow(value::validate);
Object itemsFromStruct = value.get("currencies");
assertInstanceOf(List.class, itemsFromStruct);
//noinspection unchecked
List<Object> data = (List<Object>) itemsFromStruct; // could also use List<Struct>
assertEquals(2, data.size(), "same size");
assertInstanceOf(Struct.class, data.get(0), "Object list still has type information");
Struct firstStruct = (Struct) data.get(0);
assertEquals("Hello", firstStruct.get("title"));
currencies = data.stream()
.map(o -> (Struct) o)
.map(CONVERTER::fromConnectData)
.filter(Objects::nonNull) // in case converter has errors, could return null
.collect(Collectors.toList());
assertTrue(currencies.size() <= data.size());
assertEquals("World", currencies.get(1).getTitle(), "struct parsing data worked");
}
static class CurrencyConverter implements ConnectPOJOConverter<Currency> {
@Override
public Schema getSchema() {
return CURRENCY_ITEM_SCHEMA;
}
@Override
public Currency fromConnectData(Struct s) {
// simple conversion, but more complex types could throw errors
return new Currency(
s.getString("code"),
s.getString("title"),
s.getString("url"),
s.getString("slug")
);
}
@Override
public Struct toConnectData(Currency c) {
Struct s = new Struct(getSchema());
s.put("code", c.getCode());
s.put("title", c.getTitle());
s.put("url", c.getUrl());
s.put("slug", c.getSlug());
return s;
}
}
}
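Tying this back to the original buildRecordValue, here is a possible sketch. It assumes the poster's CryptoNewsSchema/CurrenciesSchema constants, that Currencies has the usual getters, and a hypothetical CURRENCIES_ITEM_SCHEMA constant for the struct element inside the array (the converter above plays the same role):
public Struct buildRecordValue(CryptoNews cryptoNews) {
    Struct valueStruct = new Struct(CryptoNewsSchema.NEWS_SCHEMA);
    List<Currencies> currencies = cryptoNews.getCurrencies();
    if (currencies != null) {
        // map each POJO to a Struct matching the array's element schema
        List<Struct> currencyStructs = currencies.stream()
                .map(c -> new Struct(CurrenciesSchema.CURRENCIES_ITEM_SCHEMA) // hypothetical element-schema constant
                        .put("code", c.getCode())
                        .put("title", c.getTitle())
                        .put("slug", c.getSlug())
                        .put("url", c.getUrl()))
                .collect(Collectors.toList());
        valueStruct.put(CurrenciesSchema.CURRENCIES_SCHEMA_NAME, currencyStructs);
    }
    return valueStruct;
}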
An alternative approach is to just use a String schema and Jackson's ObjectMapper to produce a JSON string, then let JsonConverter handle the rest.
final ObjectMapper om = new ObjectMapper();
final Schema valueSchema = Schema.STRING_SCHEMA;
// for-each currency
Map<String, JsonNode> output = new HashMap<>();
output.put("schema", new TextNode("TODO")); // replace with the JsonConverter schema
try {
output.put("payload", om.readTree(om.writeValueAsBytes(currency))); // write and parse to avoid double-encoding
String value = om.writeValueAsString(output);
SourceRecord r = new SourceRecord(...., valueSchema, value);
records.add(r); // poll return result
} catch (IOException e) {
// TODO: handle
}
// end for-each
return records;

Flink Collector issue when Collection Object with Map of Object class

I am facing an issue where, when I collect objects from the Flink flatMap collector, the values are not collected correctly. I get object references instead of the actual values.
dataStream.filter(new FilterFunction<GenericRecord>() {
@Override
public boolean filter(GenericRecord record) throws Exception {
if (record.get("user_id") != null) {
return true;
}
return false;
}
}).flatMap(new ProfileEventAggregateFlatMapFunction(aggConfig))
.map(new MapFunction<ProfileEventAggregateEmittedTuple, String>() {
@Override
public String map(
ProfileEventAggregateEmittedTuple profileEventAggregateEmittedTupleNew)
throws Exception {
String res=null;
try {
ObjectMapper mapper = new ObjectMapper();
mapper.setVisibility(PropertyAccessor.FIELD, Visibility.ANY);
res= mapper.writeValueAsString(profileEventAggregateEmittedTupleNew);
} catch (Exception e) {
e.printStackTrace();
}
return res;
}
}).print();
public class ProfileEventAggregateFlatMapFunction extends
RichFlatMapFunction<GenericRecord, ProfileEventAggregateEmittedTuple> {
private final ProfileEventAggregateTupleEmitter aggregator;
ObjectMapper mapper = ObjectMapperPool.getInstance().get();
public ProfileEventAggregateFlatMapFunction(String config) throws IOException {
this.aggregator = new ProfileEventAggregateTupleEmitter(config);
}
@Override
public void flatMap(GenericRecord event,
Collector<ProfileEventAggregateEmittedTuple> collector) throws Exception {
List<ProfileEventAggregateEmittedTuple> aggregateTuples = aggregator.runAggregates(event);
for (ProfileEventAggregateEmittedTuple tuple : aggregateTuples) {
collector.collect(tuple);
}
}
}
Debug results:
The tuple that I collect in the collector:
tuple = {ProfileEventAggregateEmittedTuple#7880}
profileType = "userprofile"
key = "1152473"
businessType = "keyless"
name = "consumer"
aggregates = {ArrayList#7886} size = 1
0 = {ProfileEventAggregate#7888} "geo_id {geo_id=1} {keyless_select_destination_cnt=1, total_estimated_distance=12.5}"
entityType = "geo_id"
dimension = {LinkedHashMap#7891} size = 1
"geo_id" -> {Integer#7897} 1
key = "geo_id"
value = {Integer#7897} 1
metrics = {LinkedHashMap#7892} size = 2
"keyless_select_destination_cnt" -> {Long#7773} 1
key = "keyless_select_destination_cnt"
value = {Long#7773} 1
"total_estimated_distance" -> {Double#7904} 12.5
key = "total_estimated_distance"
value = {Double#7904} 12.5
This is what I get in my map function .map(new MapFunction<ProfileEventAggregateEmittedTuple, String>()):
profileEventAggregateEmittedTuple = {ProfileEventAggregateEmittedTuple#7935}
profileType = "userprofile"
key = "1152473"
businessType = "keyless"
name = "consumer"
aggregates = {GenericData$Array#7948} size = 1
0 = {ProfileEventAggregate#7950} "geo_id {geo_id=java.lang.Object#863dce2} {keyless_select_destination_cnt=java.lang.Object#7cdb4bfc, total_estimated_distance=java.lang.Object#52e81f57}"
entityType = "geo_id"
dimension = {HashMap#7952} size = 1
"geo_id" -> {Object#7957}
key = "geo_id"
value = {Object#7957}
Class has no fields
metrics = {HashMap#7953} size = 2
"keyless_select_destination_cnt" -> {Object#7962}
key = "keyless_select_destination_cnt"
value = {Object#7962}
Class has no fields
"total_estimated_distance" -> {Object#7963}
Please help me understand what is happening and why I am not getting the correct data.
public class ProfileEventAggregateEmittedTuple implements Cloneable, Serializable {
private String profileType;
private String key;
private String businessType;
private String name;
private List<ProfileEventAggregate> aggregates = new ArrayList<ProfileEventAggregate>();
private long startTime;
private long endTime;
public String getProfileType() {
return profileType;
}
public void setProfileType(String profileType) {
this.profileType = profileType;
}
public String getKey() {
return key;
}
public void setKey(String key) {
this.key = key;
}
public String getBusinessType() {
return businessType;
}
public void setBusinessType(String businessType) {
this.businessType = businessType;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public List<ProfileEventAggregate> getAggregates() {
return aggregates;
}
public void addAggregate(ProfileEventAggregate aggregate) {
this.aggregates.add(aggregate);
}
public void setAggregates(List<ProfileEventAggregate> aggregates) {
this.aggregates = aggregates;
}
public long getStartTime() {
return startTime;
}
public void setStartTime(long startTime) {
this.startTime = startTime;
}
public long getEndTime() {
return endTime;
}
public void setEndTime(long endTime) {
this.endTime = endTime;
}
@Override
public ProfileEventAggregateEmittedTuple clone() {
ProfileEventAggregateEmittedTuple clone = new ProfileEventAggregateEmittedTuple();
clone.setProfileType(this.profileType);
clone.setKey(this.key);
clone.setBusinessType(this.businessType);
clone.setName(this.name);
for (ProfileEventAggregate aggregate : this.aggregates) {
clone.addAggregate(aggregate.clone());
}
return clone;
}
}
public class ProfileEventAggregate implements Cloneable, Serializable {
private String entityType;
private Map<String, Object> dimension =new LinkedHashMap<String, Object>();
private Map<String, Object> metrics = new LinkedHashMap<String, Object>();
public Map<String, Object> getDimension() {
return dimension;
}
public void setDimension(Map<String, Object> dimension) {
this.dimension.putAll(dimension);
}
public void addDimension(String dimensionKey, Object dimensionValue) {
this.dimension.put(dimensionKey, dimensionValue);
}
public Map<String, Object> getMetrics() {
return metrics;
}
public void addMetric(String metricKey, Object metricValue) {
this.metrics.put(metricKey, metricValue);
}
public void setMetrics(Map<String, Object> metrics) {
this.metrics.putAll(metrics);
}
public String getEntityType() {
return entityType;
}
public void setEntityType(String entityType) {
this.entityType = entityType;
}
@Override
public ProfileEventAggregate clone() {
ProfileEventAggregate clone = new ProfileEventAggregate();
clone.setEntityType(this.entityType);
clone.getDimension().putAll(this.getDimension());
clone.getMetrics().putAll(this.metrics);
return clone;
}
}
When you don't enableObjectReuse, objects are copied with your configured serializer (seems to be Avro?).
In your case, you use Map<String, Object>, for which no plausible Avro schema can be inferred.
The easiest fix would be to enableObjectReuse (see the sketch below). Otherwise, make sure your serializer matches your data: you could add a unit test that uses AvroSerializer#copy, and make sure your POJO is properly annotated if you want to stick with Avro reflection, or better yet go schema-first, generating your Java POJO from an Avro schema and using specific records.
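For reference, a minimal sketch of the object-reuse switch (assuming a regular DataStream job; note that object reuse is only safe if your operators do not hold on to and mutate records after emitting them):
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
// stops Flink from deep-copying records between chained operators with the configured serializer
env.getConfig().enableObjectReuse();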
Let's discuss some alternatives:
Use GenericRecord. Instead of converting it to a Java type, directly access GenericRecord. This is usually the only way when the full record is flexible (e.g. your job takes any input and writes it out to S3).
Denormalize schema. Instead of having some class Event { int id; Map<String, Object> data; } you would use class EventInformation { int id; String predicate; Object value; }. You would need to group all information for processing. However, you will run into the same type issues with Avro.
Use a wide schema. Following on from the previous approach, if the different predicates are known beforehand, you can craft a wide schema class Event { int id; Long predicate1; Integer predicate2; ... String predicateN; } where all of the entries are nullable and most of them are indeed null. Encoding null is very cheap.
Ditch Avro. Avro is fully typed. You may want to use something more dynamic. Protobuf has Any to support arbitrary submessages.
Use Kryo. Kryo can serialize arbitrary object trees at the cost of being slower and having more overhead.
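For the Kryo option, a hedged sketch (method names from the classic ExecutionConfig API):
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
// serialize generic types (e.g. the Map<String, Object> fields here) with Kryo instead of Avro/POJO serializers
env.getConfig().enableForceKryo();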
If you want to write the data, you also need to think about a solution where the type information is added for proper deserialization. For an example, check out this JSON question. But there are more ways to implement it.

Java 8 Composition, Currying shorthand

I have a stream of BusinessObjects and need to set a value on each object. I want to use Stream.map, but map takes a Function<T,R>, while I have the target object, a discriminator value, and the new value. setNewValueInBusinessObjectExample shows what I want to do, and setNewValueInBusinessObjectWithFun is what I need help with.
Agreed! I could just use setNewValueInBusinessObjectExample in the map, but I want to see what the functional style looks like. Thanks.
class BusinessObject {
String firstField;
String secondField;
}
class SomeDiscriminator {
String value;
}
BusinessObject setNewValueInBusinessObjectExample(BusinessObject businessObject,
SomeDiscriminator discriminator, String newValue) {
if(discriminator.value.equals("firstField")) {
businessObject.firstField = newValue;
} else {//secondField
businessObject.secondField = newValue;
}
return businessObject;
}
Function<BusinessObject, Function<SomeDiscriminator, Function<String, BusinessObject>>>
setNewValueInBusinessObjectWithFun = {
/* todo: using nested Function<T,R> */
}
If you are in doubt about the construction and usage of functional interfaces, I recommend expanding the whole thing into anonymous classes, where the structure becomes obvious.
I also noticed the whole flow uses three parameters, the same as your setNewValueInBusinessObjectExample does, so move the body of that method into the innermost anonymous class.
Function<BusinessObject, Function<SomeDiscriminator, Function<String, BusinessObject>>> setNewValueInBusinessObjectWithFun =
new Function<BusinessObject, Function<SomeDiscriminator, Function<String, BusinessObject>>>() {
@Override
public Function<SomeDiscriminator, Function<String, BusinessObject>> apply(final BusinessObject businessObject) {
return new Function<SomeDiscriminator, Function<String, BusinessObject>>() {
@Override
public Function<String, BusinessObject> apply(final SomeDiscriminator someDiscriminator) {
return new Function<String, BusinessObject>() {
@Override
public BusinessObject apply(final String newValue) {
if (someDiscriminator.value.equals("firstField")) {
businessObject.firstField = newValue;
} else {//secondField
businessObject.secondField = newValue;
}
return businessObject;
}
};
}
};
}
};
Now, pack the whole thing to lambda expressions and see what happens:
Function<BusinessObject, Function<SomeDiscriminator, Function<String, BusinessObject>>> setNewValueInBusinessObjectWithFun =
businessObject -> someDiscriminator -> newValue -> {
if (someDiscriminator.value.equals("firstField")) {
businessObject.firstField = newValue;
} else {//secondField
businessObject.secondField = newValue;
}
return businessObject;
};
For the sake of clarity, name the variables inside the lambda expressions properly, otherwise you would not be able to work with them well. The usage is fairly simple (I moved the setters to constructors for the sake of brevity):
BusinessObject businessObject = new BusinessObject("oldValue");
setNewValueInBusinessObjectWithFun
.apply(businessObject) // apply to an object
.apply(new SomeDiscriminator("firstField")) // finds its field
.apply("newValue"); // sets a new value
However, I recommend defining a custom @FunctionalInterface with a more straightforward definition...
@FunctionalInterface
interface MyFunction<T, R, U> {
T apply(T t, R r, U u);
}
... and usage ...
MyFunction<BusinessObject, SomeDiscriminator, String> myFunction =
(businessObject, someDiscriminator, newValue) -> {
if (someDiscriminator.value.equals("firstField")) {
businessObject.firstField = newValue;
} else {
businessObject.secondField = newValue;
}
return businessObject;
};
BusinessObject businessObject = new BusinessObject("oldValue");
myFunction.apply(businessObject, new SomeDiscriminator("firstField"), "newValue");
Not entirely sure I get what you might need, but something like this?
static Function<BusinessObject, Function<SomeDiscriminator, Function<String, BusinessObject>>>
setNewValueInBusinessObjectWithFun =
x -> y -> z -> {
if ("firstfield".equals(y.value)) {
x.firstField = z;
} else {
x.secondField = z;
}
return x;
};
And usage would be:
BusinessObject bo = new BusinessObject();
bo.firstField = "test";
SomeDiscriminator sd = new SomeDiscriminator();
sd.value = "firstField";
bo = setNewValueInBusinessObjectWithFun.apply(bo).apply(sd).apply("value");
System.out.println(bo.firstField);
This doesn't need to be complex. If you have a Stream<BusinessObject>, then neither SomeDiscriminator nor the String value is a stream element.
So it's a Function<BusinessObject, BusinessObject> that you need to pass to map:
Stream<BusinessObject> businessObjectStream = null;
SomeDiscriminator discriminator = null; String newValue = "";
Function<BusinessObject, BusinessObject> mapper = businessObject -> {
if (discriminator.value.equals("firstField")) {
businessObject.firstField = newValue;
} else {//secondField
businessObject.secondField = newValue;
}
return businessObject;
};
Called with:
businessObjectStream.map(mapper);
Even if the discriminator.value.equals("firstField") logic depended on dynamic values in the stream element, you would perhaps have needed a Predicate<...> somewhere, but certainly not a higher-order function that returns a function dynamically.
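For completeness, a small usage sketch (assuming a hypothetical businessObjects list as the stream source):
List<BusinessObject> updated = businessObjects.stream()
        .map(mapper)
        .collect(Collectors.toList());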

How to convert List<Object> into List<T>?

This is the code for getting a Retrofit response as an object. The method below is working fine, but I need one common function for performing this functionality, i.e. the class name may vary (e.g. Ticket, Price, Token, Appointment):
processGETRequest(AppController.getApiHelper().searchTickets(from, to), new RetrofitListener() {
@Override
public void onSuccess(Object object) { }
@Override
public void onSuccess(List<Object> object) {
// Here I'm getting retrofit response as a object //
if (object != null) {
// Below method is working fine //
List<Ticket> ticketList = new ArrayList<>();
for (Object result : object) {
String json = new Gson().toJson(result);
Ticket model = new Gson().fromJson(json, Ticket.class);
ticketList.add(model);
}
// I need an one common function for performing above functionality
// i.e the Class name may vary.. (e.g) Ticket, Price, Token, Appointment like this.
}
}
@Override
public void onError(String error) {
Log.d("error: ", " " + error);
}
}, false);
The RetrofitListener interface is simply:
public interface RetrofitListener {
void onSuccess(Object object);
void onSuccess(List<Object> object);
void onError(String error);
}
You can use a static function similar to:
static <T> List<T> toList(List<Object> object, Class<T> desiredClass) {
List<T> transformedList = new ArrayList<>();
if (object != null) {
for (Object result : object) {
String json = new Gson().toJson(result);
T model = new Gson().fromJson(json, desiredClass);
transformedList.add(model);
}
}
return transformedList;
}
Basically you just need to ensure that you deliver the desired type (e.g. desiredClass) and use it in fromJson.
Sample usage:
List<Ticket> ticketList = toList(object, Ticket.class);
List<Price> priceList = toList(object, Price.class);
Note that by moving the object != null check into the toList method, you do not need to care about what is passed to it; at worst you get an empty list in return.
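A hedged drop-in variant of toList, assuming Gson 2.8+ (for TypeToken.getParameterized) and the com.google.gson.reflect.TypeToken / java.lang.reflect.Type imports, is to round-trip the whole list in one call instead of element by element:
static <T> List<T> toList(List<Object> object, Class<T> desiredClass) {
    if (object == null) {
        return new ArrayList<>();
    }
    Gson gson = new Gson();
    // serialize the whole list once, then deserialize it as List<T> in a single pass
    Type listType = TypeToken.getParameterized(List.class, desiredClass).getType();
    return gson.fromJson(gson.toJson(object), listType);
}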
<T> List<T> getList(Class<T> type, List<Object> object) {
Gson gson = new Gson();
return object.stream()
.map(gson::toJson)
.map(json -> gson.fromJson(json, type))
.collect(Collectors.toList());
}
List<Ticket> ticketList = getList(Ticket.class, object);
This will do what your for-loop did.
You can convert the snippet
if (object != null) {
List<Ticket> ticketList = new ArrayList<>();
for (Object result : object) {
String json = new Gson().toJson(result);
Ticket model = new Gson().fromJson(json, Ticket.class);
ticketList.add(model);
}
}
with generics like this
<T> List<T> check(Class<T> type, List<Object> object) {
List<T> list = new ArrayList<>();
for (Object result : object) {
String json = new Gson().toJson(result);
T model = new Gson().fromJson(json, type);
list.add(model);
}
return list;
}
According to your question, you want generic code to get a List that can be of any type.
List<Ticket> ticketList = new ArrayList<>();
for (Object result : object) {
String json = new Gson().toJson(result);
Ticket model = new Gson().fromJson(json, Ticket.class);
ticketList.add(model);
}
Is this what you want?
List<Ticket> ticketList // can be List<T>??
Then create a generic method to get the list and use it anywhere:
public <T> List<T> getObjectToList(Object obj, Class<T[]> objectArrayClass) {
String json = new Gson().toJson(obj);
return Arrays.asList(new Gson().fromJson(json, objectArrayClass));
}
Call the above method like this:
@Override
public void onSuccess(List<Object> object) {
// Here I'm getting retrofit response as a object //
if (object != null) {
// Below method is working fine //
List<Ticket> ticketList = getObjectToList(object, Ticket[].class);
// I need an one common function for performing above functionality
// i.e the Class name may vary.. (e.g) Ticket, Price, Token, Appointment like this.
}
}

Suggestions on extending fit.RowFixture and fit.TypeAdapter so that I can bind/invoke on a class that keeps attrs in a map

TL;DR: I'd like to know how to extend fit.TypeAdapter so that I can invoke a method that expects parameters, since the default implementation of TypeAdapter invokes the bound method by reflection and assumes it is a no-arg method...
Longer version -
I'm using fit to build a test harness for my system (a service that returns a sorted list of custom objects). In order to verify the system, I thought I'd use fit.RowFixture to assert attributes of the list items.
Since RowFixture expects the data to be either a public attribute or a public method, I thought of using a wrapper over my custom object (say InstanceWrapper) - I also tried to implement the suggestion given in this previous thread about formatting data in RowFixture.
The trouble is that my custom object has around 41 attributes, and I'd like to give testers the option of choosing which attributes they want to verify in this RowFixture. Plus, unless I dynamically add fields/methods to my InstanceWrapper class, how will RowFixture invoke either of my getters, since both expect the attribute name to be passed as a parameter (code copied below)?
I extended RowFixture to bind to my method, but I'm not sure how to extend TypeAdapter so that it invokes the method with the attribute name.
Any suggestions?
public class InstanceWrapper {
private Instance instance;
private Map<String, Object> attrs;
public int index;
public InstanceWrapper() {
super();
}
public InstanceWrapper(Instance instance) {
this.instance = instance;
init(); // initialise map
}
private void init() {
attrs = new HashMap<String, Object>();
String attrName;
for (AttrDef attrDef : instance.getModelDef().getAttrDefs()) {
attrName = attrDef.getAttrName();
attrs.put(attrName, instance.getChildScalar(attrName));
}
}
public String getAttribute(String attr) {
return attrs.get(attr).toString();
}
public String description(String attribute) {
return instance.getChildScalar(attribute).toString();
}
}
public class MyDisplayRules extends fit.RowFixture {
@Override
public Object[] query() {
List<Instance> list = PHEFixture.hierarchyList;
return convertInstances(list);
}
private Object[] convertInstances(List<Instance> instances) {
Object[] objects = new Object[instances.size()];
InstanceWrapper wrapper;
int index = 0;
for (Instance instance : instances) {
wrapper = new InstanceWrapper(instance);
wrapper.index = index;
objects[index++] = wrapper;
}
return objects;
}
@Override
public Class getTargetClass() {
return InstanceWrapper.class;
}
@Override
public Object parse(String s, Class type) throws Exception {
return super.parse(s, type);
}
@Override
protected void bind(Parse heads) {
columnBindings = new TypeAdapter[heads.size()];
for (int i = 0; heads != null; i++, heads = heads.more) {
String name = heads.text();
String suffix = "()";
try {
if (name.equals("")) {
columnBindings[i] = null;
} else if (name.endsWith(suffix)) {
columnBindings[i] = bindMethod("description", name.substring(0, name.length()
- suffix.length()));
} else {
columnBindings[i] = bindField(name);
}
} catch (Exception e) {
exception(heads, e);
}
}
}
protected TypeAdapter bindMethod(String name, String attribute) throws Exception {
Class partypes[] = new Class[1];
partypes[0] = String.class;
return PHETypeAdaptor.on(this, getTargetClass().getMethod("getAttribute", partypes), attribute);
}
}
For what it's worth, here's how I eventually worked around the problem:
I created a custom type adapter (extending fit.TypeAdapter) with an additional public attribute, String attrName. Also:
@Override
public Object invoke() throws IllegalAccessException, InvocationTargetException {
if ("getAttribute".equals(method.getName())) {
Object params[] = { attrName };
return method.invoke(target, params);
} else {
return super.invoke();
}
}
Then I extended fit.RowFixture and made the following overrides:
public Class getTargetClass() - to return my class reference
protected TypeAdapter bindField(String name) throws Exception - this is a protected method in ColumnFixture which I overrode so that it uses my class's getter method:
@Override
protected TypeAdapter bindField(String name) throws Exception {
String fieldName = camel(name);
// for all attributes, use method getAttribute(String)
Class methodParams[] = new Class[1];
methodParams[0] = String.class;
TypeAdapter a = TypeAdapter.on(this, getTargetClass().getMethod("getAttribute", methodParams));
PHETypeAdapter pheAdapter = new PHETypeAdapter(fieldName);
pheAdapter.target = a.target;
pheAdapter.fixture = a.fixture;
pheAdapter.field = a.field;
pheAdapter.method = a.method;
pheAdapter.type = a.type;
return pheAdapter;
}
I know this is not a neat solution, but it was the best I could come up with. Maybe I'll get some better solutions here :-)
