Serialization in the Hazelcast IMap - java

I'm trying to load data into an IMap inside a Jet pipeline stage and I'm getting the error below. Here is my code:
public static Pipeline pipeLineStage(JetInstance jet) {
    Pipeline pipeLine = Pipeline.create();
    BatchStage<DataModel> dbValue = pipeLine.readFrom(Sources.jdbc(
            "jdbc:postgresql://localhost/postgres?user=postgres&password=root",
            "SELECT id1, id2, id3, id4\r\n"
                    + " FROM public.tbl_test where id1='3'",
            resultSet -> new DataModel(resultSet.getString(2), resultSet.getString(3), resultSet.getString(4))));
    dbValue.filter(model -> model.getId2().equals("person"))
           .map(model -> JsonUtil.mapFrom(model.getObject_value()))
           .map(map -> {
               IMap<Object, Object> map1 = jet.getMap("map1");
               map1.put("employee_id", map.get("id"));
               return map;
           })
           .writeTo(Sinks.logger());
    return pipeLine;
}
Error:
Exception in thread "main" java.lang.IllegalArgumentException: "mapFn" must be serializable
at com.hazelcast.jet.impl.util.Util.checkSerializable(Util.java:203)
If I store the data in a normal Map I don't get any error; the error only occurs when I store it in an IMap. In the code above I'm using a model class, DataModel, declared as public class DataModel implements Serializable {}. Any suggestions would be helpful. Thanks.

The "mapFn" in the error is not a serialization factory; it is the name of the function argument you pass to map(). Jet has to serialize that lambda in order to distribute it across the cluster, and your lambda captures the JetInstance parameter jet (through jet.getMap("map1")), which is not serializable. That is why the error appears even though DataModel already implements Serializable, and why a plain HashMap works (HashMap is serializable, the JetInstance proxy is not). Instead of putting entries into the IMap from inside a mapping function, write the stage to a map sink such as Sinks.map, or use mapUsingIMap if you need to read the map during processing.
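A minimal sketch of the sink-based approach, reusing the DataModel and JsonUtil helpers from the question; the map name "map1" and the key "employee_id" are taken from the original code:

public static Pipeline pipeLineStage() {
    Pipeline pipeLine = Pipeline.create();
    BatchStage<DataModel> dbValue = pipeLine.readFrom(Sources.jdbc(
            "jdbc:postgresql://localhost/postgres?user=postgres&password=root",
            "SELECT id1, id2, id3, id4 FROM public.tbl_test where id1='3'",
            resultSet -> new DataModel(resultSet.getString(2), resultSet.getString(3), resultSet.getString(4))));

    dbValue.filter(model -> model.getId2().equals("person"))
           .map(model -> JsonUtil.mapFrom(model.getObject_value()))
           // write into the IMap through a sink instead of capturing JetInstance in a lambda
           .writeTo(Sinks.map("map1",
                   map -> "employee_id",      // key
                   map -> map.get("id")));    // value
    return pipeLine;
}

If you still want the logger output as well, the same stage can be written to Sinks.logger() with a second writeTo call.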

Related

"Optional group value" Exception while writing Map<String, Set<String>> to parquet in Spark-Java

I have a POJO model as follows -
public class Model implements Serializable {
    private Map<String, Set<String>> map;
}
which I am trying to write to a parquet file using Spark. The code for this is as follows:
JavaRDD<Context> dataSet = generate();
JavaPairRDD<Model, Model> outputRDD = dataSet.mapToPair((PairFunction<Context, Model, Model>)
        context -> new Tuple2<>(context.getModel1(), context.getModel2()));
Dataset<Tuple2<Model, Model>> outputDS = sqlContext.createDataset(JavaPairRDD.toRDD(outputRDD),
        Encoders.tuple(Encoders.bean(Model.class), Encoders.bean(Model.class)));
outputDS.coalesce(numPartitions).write().mode(SaveMode.Overwrite).parquet(outputPath + "v2/");
It gives me the following exception because of using a Set<> inside the map.
Caused by: org.apache.parquet.schema.InvalidSchemaException: Cannot write a schema with an empty group: optional group value {
}
at org.apache.parquet.schema.TypeUtil$1.visit(TypeUtil.java:27)
at org.apache.parquet.schema.GroupType.accept(GroupType.java:255)
at org.apache.parquet.schema.TypeUtil$1.visit(TypeUtil.java:31)
at org.apache.parquet.schema.GroupType.accept(GroupType.java:255)
at org.apache.parquet.schema.TypeUtil$1.visit(TypeUtil.java:31)
So, I tried making it a Map<String, List<String>> and it worked fine.
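For reference, a sketch of that workaround with the Set replaced by a List; the getter and setter are assumed here, since Encoders.bean relies on bean properties:

public class Model implements Serializable {
    // Encoders.bean could not produce a Parquet schema for the Set<String> values,
    // but List<String> values are written without the empty-group error
    private Map<String, List<String>> map;

    public Map<String, List<String>> getMap() { return map; }
    public void setMap(Map<String, List<String>> map) { this.map = map; }
}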
However, since this model is used all over the code base, changing it would have many repercussions.
Why is this happening? And how to resolve this?
Thanks in advance!
P.S. I am using Spark 3.1.2

How to pass parameter to a namedQuery from a camel route?

I have an application running camel on spring-boot.
I want to pass a parameter retriesAllowed to a namedQuery from a camel route.
namedQuery:
@NamedQuery(name = "findFailedMessages", query = RawMessageTx.HQL_FIND_FAILED_MESSAGES)
public class RawMessageTx extends BaseEntityWithTransmitStatus implements Serializable {

    public static final String HQL_FIND_FAILED_MESSAGES = "SELECT x from RawMessageTx x WHERE x.status = 'FAILED' and x.tryAgain = 1 and x.retriesAttempted <= :retriesAllowed ORDER BY x.created";
Route:
from("seda:retry_poll_failed_messages").routeId("retry_poll_failed_messages")
.setHeader("retriesAllowed", constant(retriesAllowed))
.toF("jpa:%s?namedQuery=findFailedMessages&maximumResults=%d", RawMessageTx.class.getName(),
maxMessagesPerPollAttempt)
.split(body())
.to("direct:anotherEndpoint");
I have tried different things, but it does not work, and I can't find a good example of this online.
This is explained in the documentation of the Camel JPA component:
parameters: This key/value mapping is used for building the query
parameters. It is expected to be of the generic type java.util.Map
where the keys are the named parameters of a given JPA query
This means that you have to do something like:
to("jpa:...?parameters=#myMap")
Where "myMap" is the name of a bean of type 'java.util.Map' that is available in Camel registry.
One possible way (among others) to register such bean could be a Camel Processor (to invoke before the jpa endpoint of course)
public void process(Exchange exchange) throws Exception {
    Map<String, Object> params = Map.of("status", "demo");
    exchange.getContext().getRegistry().bind("myMap", params);
}
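Applied to the route from the question, a sketch could look like the following (assuming retriesAllowed and maxMessagesPerPollAttempt are available in scope; the bean name myMap is arbitrary):

from("seda:retry_poll_failed_messages").routeId("retry_poll_failed_messages")
        // bind the named-parameter map under the name referenced by parameters=#myMap
        .process(exchange -> exchange.getContext().getRegistry()
                .bind("myMap", Map.of("retriesAllowed", retriesAllowed)))
        .toF("jpa:%s?namedQuery=findFailedMessages&maximumResults=%d&parameters=#myMap",
                RawMessageTx.class.getName(), maxMessagesPerPollAttempt)
        .split(body())
        .to("direct:anotherEndpoint");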

How do I get QueryFields from custom class when defining cache?

I'm trying to get Ignite cache data through JDBC. For that purpose I define a new custom class and annotate its fields like this:
public class MyClass implements Serializable {

    @QuerySqlField(index = true)
    public Integer id;

    @QuerySqlField(index = true)
    public String records_offset;

    @QuerySqlField(index = true)
    public Integer session_id;

    ...
}
Then I start Ignite this way:
CacheConfiguration conf = new CacheConfiguration();
conf.setBackups(1);
conf.setName("test");
QueryEntity queryEntity = new QueryEntity();
queryEntity.setKeyType(Integer.class.getName());
queryEntity.setValueType(CDR.class.getName());
queryEntity.setTableName("CDR");
conf.setQueryEntities(Arrays.asList(queryEntity));
IgniteConfiguration iconf = new IgniteConfiguration();
iconf.setCacheConfiguration(conf);
iconf.setPeerClassLoadingEnabled(true);
this.ignite = Ignition.start(iconf);
this.cache = ignite.getOrCreateCache("test");
Now when I try to get the data through JDBC, I get an error:
Error: class org.apache.ignite.binary.BinaryObjectException: Custom objects are not supported (state=50000,code=0)
I can define a set of fields explicitly to be able to fetch the data over JDBC:
LinkedHashMap<String, String> fields = new LinkedHashMap();
fields.put("session_id", Integer.class.getName());
fields.put("records_offset", String.class.getName());
queryEntity.setFields(fields);
But why do I need to do this if I've already annotated the fields in the class definition?
You have three options to define the SQL schema:
1. Annotations plus CacheConfiguration.setIndexedTypes:
https://apacheignite.readme.io/docs/cache-queries#section-query-configuration-by-annotations
2. A QueryEntity configuration:
https://apacheignite.readme.io/docs/cache-queries#section-query-configuration-using-queryentity
3. Pure SQL (CREATE TABLE):
https://apacheignite-sql.readme.io/docs/create-table
In your case you mixed [1] and [2]: you registered the key and value types through a QueryEntity but defined the fields with annotations, and mixing the two approaches does not work. Stick to one specific way, as you already did by registering the key and value types for indexing with the CacheConfiguration.setIndexedTypes method; then you can get rid of the QueryEntity entirely.
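A minimal sketch of the annotation-only configuration, assuming the annotated class shown above (MyClass) is the cache value type with an Integer key, mirroring the settings from the question's QueryEntity:

CacheConfiguration<Integer, MyClass> conf = new CacheConfiguration<>("test");
conf.setBackups(1);
// let Ignite derive the SQL schema (columns and indexes) from the @QuerySqlField annotations
conf.setIndexedTypes(Integer.class, MyClass.class);

IgniteConfiguration iconf = new IgniteConfiguration();
iconf.setCacheConfiguration(conf);
iconf.setPeerClassLoadingEnabled(true);

Ignite ignite = Ignition.start(iconf);
IgniteCache<Integer, MyClass> cache = ignite.getOrCreateCache("test");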

How to save and query dynamic fields in Spring Data MongoDB?

I'm on Spring boot 1.4.x branch and Spring Data MongoDB.
I want to extend a POJO from HashMap to give it the ability to save new properties dynamically.
I know I can create a Map<String, Object> properties field in the Entry class to hold my dynamic values, but I don't want an inner structure. My goal is to have all fields at the root of the Entry class so that it serializes like this:
{
    "id": "12334234234",
    "dynamicField1": "dynamicValue1",
    "dynamicField2": "dynamicValue2"
}
So I created this Entry class:
@Document
public class Entry extends HashMap<String, Object> {

    @Id
    private String id;

    public String getId() {
        return id;
    }

    public void setId(String id) {
        this.id = id;
    }
}
And the repository like this:
public interface EntryRepository extends MongoRepository<Entry, String> {
}
When I launch my app I have this error:
Error creating bean with name 'entryRepository': Invocation of init method failed; nested exception is org.springframework.data.mapping.model.MappingException: Could not lookup mapping metadata for domain class java.util.HashMap!
Any idea?
TL;DR
Do not use Java collection/map types as a base class for your entities.
Repositories are not the right tool for your requirement.
Use DBObject with MongoTemplate if you need dynamic top-level properties.
Explanation
Spring Data Repositories are repositories in the DDD sense, acting as a persistence gateway for your well-defined aggregates. They inspect domain classes to derive the appropriate queries. Spring Data excludes collection and map types from entity analysis, and that's why extending your entity from a Map fails.
Repository query methods for dynamic properties are possible, but it's not the primary use case. You would have to use SpEL queries to express your query:
public interface EntryRepository extends MongoRepository<Entry, String> {

    @Query("{ ?0 : ?1 }")
    Entry findByDynamicField(String field, Object value);
}
This method does not give you any type safety regarding the predicate value and is only an ugly alias for a proper, individual query.
Rather use DBObject with MongoTemplate and its query methods directly:
List<DBObject> result = template.find(new Query(Criteria.where("your_dynamic_field")
        .is(theQueryValue)), DBObject.class);
DBObject is a Map that gives you full access to properties without enforcing a pre-defined structure. You can create, read, update and delete DBObject instances via the Template API.
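For completeness, a sketch of creating such a document through the template; the collection name "entries" is just a placeholder, since a plain DBObject carries no @Document metadata:

DBObject entry = new BasicDBObject();
entry.put("_id", "12334234234");
entry.put("dynamicField1", "dynamicValue1");
entry.put("dynamicField2", "dynamicValue2");

// an explicit collection name is required because DBObject is not a mapped entity
template.save(entry, "entries");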
A last thing
You can declare dynamic properties on a nested level using a Map, if your aggregate root declares some static properties:
@Document
public class Data {

    @Id
    private String id;
    private Map<String, Object> details;
}
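With that layout, a query on one of the dynamic keys simply targets the nested path; a short sketch reusing the field names from the question's JSON:

List<Data> result = template.find(
        new Query(Criteria.where("details.dynamicField1").is("dynamicValue1")),
        Data.class);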
Here we can achieve this using a JSONObject.
The entity will look like this:
@Document
public class Data {

    @Id
    private String id;
    private JSONObject details;

    // getters and setters
}
The POJO will look like this:
public class DataDTO {

    private String id;
    private JSONObject details;

    // getters and setters
}
In the service:
Data formData = new Data();
JSONObject details = dataDTO.getDetails();
details.put("dynamicField1", "dynamicValue1");
details.put("dynamicField2", "dynamicValue2");
formData.setDetails(details);
mongoTemplate.save(formData);
I have done this as per my business requirements; refer to this code and adapt it to yours. Is this helpful?

Map with Long key does not work in Serializable class

I have a Serializable class with a map property. When the map has a Long key the code does not work, while with a String key it works.
This doesn't work:
public class UserSession implements Serializable {
    Map<Long, Date> timeQuestionAsked = new HashMap<>();
}
This does work:
public class UserSession implements Serializable {
    Map<String, Date> timeQuestionAsked = new HashMap<>();
}
The weird thing is that I get no exception. This class is loaded in a filter in Jetty (a Google App Engine app), and when I try to use the class with the Long key, I get a strange "Not found" error.
Actually it was caused by the database framework I was using: Objectify. It turns out that Map keys must be Strings: https://code.google.com/p/objectify-appengine/wiki/Entities#Maps
It has nothing to do with Serializable...
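A minimal sketch of that workaround, keeping the Long ids in the code but storing them under String keys so Objectify can persist the map; the helper methods are only illustrative:

public class UserSession implements Serializable {
    // Objectify requires Map keys to be Strings, so store the Long id in its string form
    Map<String, Date> timeQuestionAsked = new HashMap<>();

    public void markAsked(Long questionId, Date when) {
        timeQuestionAsked.put(Long.toString(questionId), when);
    }

    public Date getAskedTime(Long questionId) {
        return timeQuestionAsked.get(Long.toString(questionId));
    }
}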
