Java MongoDB Pojo Custom Id Type - java

I made a class like this:
@BsonDiscriminator
public class User {
@BsonId
private Integer _id;
// some properties
// getter & setter
}
and register it with the codec registry:
ClassModel<User> userModel = ClassModel.builder(User.class).enableDiscriminator(true).build();
PojoCodecProvider pojoCodecProvider = PojoCodecProvider.builder().register(userModel).build();
pojoCodecRegistry = fromRegistries(MongoClient.getDefaultCodecRegistry(), fromProviders(pojoCodecProvider));
mongoClient = new MongoClient("localhost", MongoClientOptions.builder().codecRegistry(pojoCodecRegistry).build());
mongoDatabase = mongoClient.getDatabase("bbs").withCodecRegistry(pojoCodecRegistry);
When I try to insert
public int addOne(User user) {
try {
user.set_id(Db.getNextId("user"));
userCollection.insertOne(user);
} catch (Exception e) {
e.printStackTrace();
}
return user.get_id();
}
But when I look at the document in Mongo, its _id field type is ObjectId, not Int32.
I declared _id as Integer, so why?

The default type of _id in MongoDB is ObjectId. It is not decided by the class definition in Java: the field only ends up as Int32 if the Integer value actually reaches the server at insert time.
When no _id value is supplied with the insert, MongoDB (or the driver) generates an ObjectId for it.
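A quick way to see the default in action (a minimal sketch, assuming the same setup as above and a hypothetical scratch collection, using the plain Document API rather than the POJO codec): an explicit Integer _id is stored as Int32, while omitting _id leaves the driver to generate an ObjectId.
MongoCollection<Document> check = mongoDatabase.getCollection("idCheck"); // hypothetical collection, org.bson.Document
check.insertOne(new Document("_id", 42).append("name", "explicit int id")); // _id stored as Int32
check.insertOne(new Document("name", "no id supplied")); // no _id sent, so it becomes an ObjectId
So if the Integer _id really is set before insertOne and the POJO codec is applied to the collection, the stored _id should be Int32; an ObjectId in the collection means the value never reached the server as an Int32.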

Related

Kafka Connect. How to handle List of custom object, when specifying schema and building SourceRecord value

I have a DTO, CryptoNews, which contains
List<Currencies> currencies
I would like to save the "currencies" field to the SourceRecord when constructing it.
I can't figure out how to:
Declare it in the schema.
Pass it to the Struct object when building the value.
My attempts end in this exception:
Invalid Java object for schema type STRUCT: class com.dto.Currencies
Kafka Connect doesn't provide an explicit example of how to handle the case where an object in a List requires its own Schema.
I also tried to apply a similar approach to the one in the Kafka test cases, but it doesn't work: https://github.com/apache/kafka/blob/trunk/connect/api/src/test/java/org/apache/kafka/connect/data/StructTest.java#L95-L98
How do I do this?
kafka-connect-api version: 0.10.2.0-cp1
value and key converter: org.apache.kafka.connect.json.JsonConverter
no avro used
class CryptoNews implements Serializable {
// omitted fields
private List<Currencies> currencies;
}
class Currencies {
private String code;
private String title;
private String slug;
private String url;
}
SchemaConfiguration
public static final Integer FIRST_VERSION = 1;
public static final String CURRENCIES_SCHEMA_NAME = "currencies";
public static final Schema CURRENCIES_SCHEMA = SchemaBuilder.array(
SchemaBuilder.struct()
.field(CODE_FIELD, Schema.OPTIONAL_STRING_SCHEMA)
.field(TITLE_FIELD, Schema.OPTIONAL_STRING_SCHEMA)
.field(SLUG_FIELD, Schema.OPTIONAL_STRING_SCHEMA)
.field(URL_FIELD, Schema.OPTIONAL_STRING_SCHEMA)
.optional()
.build()
)
.optional()
.name(CURRENCIES_SCHEMA_NAME)
.version(FIRST_VERSION)
.build();
// NEWS_SCHEMA is declared after CURRENCIES_SCHEMA so the field reference below is not an illegal forward reference
public static final Schema NEWS_SCHEMA = SchemaBuilder.struct().name("News")
.version(FIRST_VERSION)
.field(CURRENCIES_SCHEMA_NAME, CURRENCIES_SCHEMA)
// simple fields omitted for brevity
.build();
SourceTask
return new SourceRecord(
sourcePartition(),
sourceOffset(cryptoNews),
config.getString(TOPIC_CONFIG),
null,
CryptoNewsSchema.NEWS_KEY_SCHEMA,
buildRecordKey(cryptoNews),
CryptoNewsSchema.NEWS_SCHEMA,
buildRecordValue(cryptoNews),
Instant.now().toEpochMilli()
);
public Struct buildRecordValue(CryptoNews cryptoNews){
Struct valueStruct = new Struct(CryptoNewsSchema.NEWS_SCHEMA);
// Produces Invalid Java object for schema type STRUCT: class com.dto.Currencies
List<Currencies> currencies = cryptoNews.getCurrencies();
if (currencies != null) {
valueStruct.put(CurrenciesSchema.CURRENCIES_SCHEMA_NAME, currencies);
}
return valueStruct;
}
UPDATE:
worker.properties
bootstrap.servers=localhost:29092
key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=true
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=true
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=true
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter.schemas.enable=true
rest.port=8086
rest.host.name=127.0.0.1
offset.storage.file.filename=offsets/standalone.offsets
offset.flush.interval.ms=10000
You need to provide a List<Struct>.
Here's a full unit-test example.
First, an interface that will help:
public interface ConnectPOJOConverter<T> {
Schema getSchema();
T fromConnectData(Struct s);
Struct toConnectData(T t);
}
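The test also uses a small Currency POJO with a four-argument constructor and getters; it is not shown in the original answer, so here is a minimal sketch (the constructor parameter order is an assumption):
class Currency {
private final String code;
private final String title;
private final String slug;
private final String url;
Currency(String code, String title, String slug, String url) {
this.code = code;
this.title = title;
this.slug = slug;
this.url = url;
}
public String getCode() { return code; }
public String getTitle() { return title; }
public String getSlug() { return slug; }
public String getUrl() { return url; }
}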
class ArrayStructTest {
public static final Schema CURRENCY_ITEM_SCHEMA = SchemaBuilder.struct()
.version(1)
.name(Currency.class.getName())
.doc("A currency item")
.field("code", Schema.OPTIONAL_STRING_SCHEMA)
.field("title", Schema.OPTIONAL_STRING_SCHEMA)
.field("slug", Schema.OPTIONAL_STRING_SCHEMA)
.field("url", Schema.OPTIONAL_STRING_SCHEMA)
.build();
static final ConnectPOJOConverter<Currency> CONVERTER = new CurrencyConverter();
@Test
void myTest() {
// Given
List<Currency> currencies = new ArrayList<>();
// TODO: Get from external source
currencies.add(new Currency("200", "Hello", "/slug", "http://localhost"));
currencies.add(new Currency("200", "World", "/slug", "http://localhost"));
// When: build Connect Struct data
Schema valueSchema = SchemaBuilder.struct()
.name("CryptoNews")
.doc("A record holding a list of currency items")
.version(1)
.field("currencies", SchemaBuilder.array(CURRENCY_ITEM_SCHEMA).required().build())
.build();
final List<Struct> items = currencies.stream()
.map(CONVERTER::toConnectData)
.collect(Collectors.toList());
// In the SourceTask, this is what goes into the SourceRecord along with the valueSchema
Struct value = new Struct(valueSchema);
value.put("currencies", items);
// Then
assertDoesNotThrow(value::validate);
Object itemsFromStruct = value.get("currencies");
assertInstanceOf(List.class, itemsFromStruct);
//noinspection unchecked
List<Object> data = (List<Object>) itemsFromStruct; // could also use List<Struct>
assertEquals(2, data.size(), "same size");
assertInstanceOf(Struct.class, data.get(0), "Object list still has type information");
Struct firstStruct = (Struct) data.get(0);
assertEquals("Hello", firstStruct.get("title"));
currencies = data.stream()
.map(o -> (Struct) o)
.map(CONVERTER::fromConnectData)
.filter(Objects::nonNull) // in case converter has errors, could return null
.collect(Collectors.toList());
assertTrue(currencies.size() <= data.size());
assertEquals("World", currencies.get(1).getTitle(), "struct parsing data worked");
}
static class CurrencyConverter implements ConnectPOJOConverter<Currency> {
@Override
public Schema getSchema() {
return CURRENCY_ITEM_SCHEMA;
}
@Override
public Currency fromConnectData(Struct s) {
// simple conversion, but more complex types could throw errors
return new Currency(
s.getString("code"),
s.getString("title"),
s.getString("url"),
s.getString("slug")
);
}
@Override
public Struct toConnectData(Currency c) {
Struct s = new Struct(getSchema());
s.put("code", c.getCode());
s.put("title", c.getTitle());
s.put("url", c.getUrl());
s.put("slug", c.getSlug());
return s;
}
}
}
The alternative approach is to just use a String schema, and use Jackson ObjectMapper to get a JSON string, then let JSONConverter handle the rest.
final ObjectMapper om = new ObjectMapper();
final Schema valueSchema = Schema.STRING_SCHEMA;
output.put("schema", new TextNode("TODO")); // replace with JSONConverter schema
// for-each currency
Map<String, JsonNode> output = new HashMap<>();
try {
output.put("payload", om.readTree(om.writeValueAsBytes(currency))); // write and parse to not double-encode
String value = om.writeValueAsString(output);
SourceRecord r = new SourceRecord(...., valueSchema, value);
records.add(r); // poll return result
} catch (IOException e) {
// TODO: handle
}
// end for-each
return records;

json Array to bean ( object mapper )

Read the JSON and store it into a bean by creating the new getters and setters. I want to read the nested province values (confirmed, recovered, deaths, and active) from the JSON below, which is received as a string.
[{"country":"India","provinces":[{"province":"India","confirmed":265928,"recovered":129095,"deaths":7473,"active":129360}],"latitude":20.593684,"longitude":78.96288,"date":"2020-06-08"}]
Bean:
@JsonIgnoreProperties(ignoreUnknown = true)
public class CoronaBean {
private String country;
// the other properties still need to be created
}
ObjectMapper mapper = new ObjectMapper();
try {
CoronaBean[] coronaBean = mapper.readValue(json, CoronaBean[].class);
for(CoronaBean c: coronaBean ){
System.out.println(c.getCountry());
}
} catch (JsonProcessingException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
I am able to read the country successfully, but I want to read the other (nested province) values mentioned above.
CoronaBean should contain a property provinces, which must be another bean with the properties you want from there. Simple as that.
Look at the code:
@JsonIgnoreProperties(ignoreUnknown = true)
public class CoronaBean {
private String country;
private ProvinceBean[] provinces;
...getters and setters
}
@JsonIgnoreProperties(ignoreUnknown = true)
public class ProvinceBean {
private Integer confirmed;
private Integer recovered;
// ...the rest you want, plus getters and setters
}
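With the nested bean in place, the province values can be read the same way as country; a minimal sketch, assuming getProvinces/getConfirmed/getRecovered are the generated accessor names for the fields above:
CoronaBean[] coronaBean = mapper.readValue(json, CoronaBean[].class);
for (CoronaBean c : coronaBean) {
System.out.println(c.getCountry());
for (ProvinceBean p : c.getProvinces()) {
// nested values come from ProvinceBean
System.out.println(p.getConfirmed() + " confirmed, " + p.getRecovered() + " recovered");
}
}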
I think you can also check this question for more details and ways to achieve what you need:
How to parse JSON in Java

How to avoid duplicate entries when inserting into MongoDB

I just want to be sure, when inserting a new DBObject into the DB, that it is really unique and that the collection doesn't contain duplicates of the key field.
Here is how it looks now:
public abstract class AbstractMongoDAO<ID, MODEL> implements GenericDAO<ID, MODEL> {
protected Mongo client;
protected Class<MODEL> model;
protected DBCollection dbCollection;
/**
* Contains model data : unique key name and name of get method
*/
protected KeyField keyField;
@SuppressWarnings("unchecked")
protected AbstractMongoDAO() {
ParameterizedType genericSuperclass = (ParameterizedType) this.getClass().getGenericSuperclass();
model = (Class<MODEL>) genericSuperclass.getActualTypeArguments()[1];
getKeyField();
}
public void connect() throws UnknownHostException {
client = new MongoClient(Config.getMongoHost(), Integer.parseInt(Config.getMongoPort()));
DB clientDB = client.getDB(Config.getMongoDb());
clientDB.authenticate(Config.getMongoDbUser(), Config.getMongoDbPass().toCharArray());
dbCollection = clientDB.getCollection(getCollectionName(model));
}
public void disconnect() {
if (client != null) {
client.close();
}
}
@Override
public void create(MODEL model) {
Object keyValue = get(model);
try {
ObjectMapper mapper = new ObjectMapper();
String requestAsString = mapper.writerWithDefaultPrettyPrinter().writeValueAsString(model);
// check if not presented
BasicDBObject dbObject = new BasicDBObject((String) keyValue, requestAsString);
dbCollection.ensureIndex(dbObject, new BasicDBObject("unique", true));
dbCollection.insert(new BasicDBObject((String) keyValue, requestAsString));
} catch (Throwable e) {
throw new RuntimeException(String.format("Duplicate parameters '%s' : '%s'", keyField.id(), keyValue));
}
}
private Object get(MODEL model) {
Object result = null;
try {
Method m = this.model.getMethod(this.keyField.get());
result = m.invoke(model);
} catch (Exception e) {
throw new RuntimeException(String.format("Couldn't find method by name '%s' at class '%s'", this.keyField.get(), this.model.getName()));
}
return result;
}
/**
* Extract the name of the collection that is specified in the '@Entity' annotation.
*
* @param clazz is the model class object.
* @return the name of the collection that is specified.
*/
private String getCollectionName(Class<MODEL> clazz) {
Entity entity = clazz.getAnnotation(Entity.class);
String tableName = entity.value();
if (tableName.equals(Mapper.IGNORED_FIELDNAME)) {
// think about usual logger
tableName = clazz.getName();
}
return tableName;
}
private void getKeyField() {
for (Field field : this.model.getDeclaredFields()) {
if (field.isAnnotationPresent(KeyField.class)) {
keyField = field.getAnnotation(KeyField.class);
break;
}
}
if (keyField == null) {
throw new RuntimeException(String.format("Couldn't find key field at class : '%s'", model.getName()));
}
}
}
KeyField is a custom annotation:
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
public @interface KeyField {
String id();
String get();
String statusProp() default "ALL";
}
But I'm not sure that this solution really guarantees this. I'm new to Mongo.
Any suggestions?
Uniqueness can be maintained in MongoDB using the _id field. If you do not provide a value for this field, MongoDB automatically creates a unique id for each document in that particular collection.
So, in your case, just create a property called _id in Java and assign your unique field value to it. If it is duplicated, the insert will throw an exception.
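A minimal sketch of that behavior with the 2.x driver API used in the question (the collection name and values are made up, and the default acknowledged write concern is assumed): the second insert with the same _id is rejected with a duplicate key error.
DBCollection users = clientDB.getCollection("users"); // hypothetical collection
users.insert(new BasicDBObject("_id", "john@example.com").append("name", "John"));
try {
users.insert(new BasicDBObject("_id", "john@example.com").append("name", "John again"));
} catch (MongoException e) {
// duplicate key error (E11000): the second insert is rejected
}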
With Spring Data MongoDB (the question was tagged with spring-data, that's why I suggest it), all you need is that:
// Your types
class YourType {
BigInteger id;
#Indexed(unique = true) String emailAddress;
…
}
interface YourTypeRepository extends CrudRepository<YourType, BigInteger> { }
// Infrastructure setup, if you use Spring as container prefer @EnableMongoRepositories
MongoOperations operations = new MongoTemplate(new MongoClient(), "myDatabase");
MongoRepositoryFactory factory = new MongoRepositoryFactory(operations);
YourTypeRepository repository = factory.getRepository(YourTypeRepository.class);
// Now use it…
YourType first = …; // set email address
YourType second = …; // set same email address
repository.save(first);
repository.save(second); // will throw an exception
The crucial part that's most related to your original question is @Indexed, as this will cause the required unique index to be created when you create the repository.
What you get beyond that is:
no need to manually implement any repository (deleted code does not contain bugs \o/)
automatic object-to-document conversion
automatic index creation
powerful repository abstraction to easily query data by declaring query methods (see the sketch below)
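For instance, the last point means a lookup by the unique email address needs nothing more than a derived method declaration (findByEmailAddress is a hypothetical name following Spring Data's method-name convention, not something from the types above):
interface YourTypeRepository extends CrudRepository<YourType, BigInteger> {
YourType findByEmailAddress(String emailAddress); // query derived from the method name, no implementation needed
}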
For more details, check out the reference documentation.

cast mongodb query result to data model class in java

I want to create a Java data-model class so that I can automatically get some properties of the data model retrieved from a MongoDB DBCollection by using methods defined in the class. Let's say I have a data structure stored in a MongoDB collection named "STUDENT" like:
{
"name":"Jone",
"id":"20140314201"
"courses":[
{
"CourseName":"math",
"teacher":"Prof Smith",
"Score":80
},
{
"CourseName":"literature",
"teacher":"Brown"
"Score":58
}
]
}
It's always convenient to define a student class like this:
class Student extends BasicDBObject{
private List<Course> courseList = new ArrayList();
private final String name;
private final String id;
public Student(String _name,String _id){
name = _name;
id = _id;
}
public List<Course> getFailedCourseList(){
List<Course> failedCourseList = blablabla...
return failedCourseList;
}
public void addCourse(Course _course){
courseList.add(_course);
}
.....
}
The question is: can I do something to make these happen?
1. When saving a STUDENT item into MongoDB I can just do this:
Student studentItem = new Student("Jone","20140314201");
studentItem.addCourse(course1);
studentItem.addCourse(course2);
....
DBC.save(studentItem);
2. When retrieving data from the DB collection I can just cast the BasicDBObject (the default object type that DBCollection's findOne returns) to the Student class I defined:
Student studentJone = (Student) DBC.findOne(new BasicDBObject("name", "Jone"));
so that I can find out which courses failed just by invoking a method of the Student class:
List<Course> failedCourseList = studentJone.getFailedCourseList();
Try this:
BasicDBObject query = new BasicDBObject();
query.put("any key","any value"); // This adds criteria
DBObject dbObjectResult = getMongoTemplate().getCollection(COLLECTION)
.findOne(query);
Foo foo = getMongoTemplate().getConverter().read(Foo.class, dbObjectResult);
It should work.
Of course you can.
Please see this URL: http://docs.mongodb.org/ecosystem/tutorial/use-java-dbobject-to-perform-saves/
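In outline, the pattern from that tutorial applied to the Student class above would look roughly like this (a sketch only: it assumes Student keeps its state in the inherited BasicDBObject map via put/get rather than in plain private fields, and that it has a public no-argument constructor, which setObjectClass requires for decoding):
DBCollection DBC = db.getCollection("STUDENT"); // db is an existing DB handle
DBC.setObjectClass(Student.class); // instantiate Student instead of BasicDBObject on reads
Student studentItem = new Student("Jone", "20140314201");
DBC.save(studentItem); // works because Student is a DBObject
Student studentJone = (Student) DBC.findOne(new BasicDBObject("name", "Jone"));
List<Course> failedCourseList = studentJone.getFailedCourseList();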

Unable to map from mongodb results to java custom object

I have this type of record present inside my collection:
{
"_id": {
"$oid": "537470dce4b067b395ba47f2"
},
"symbol": "CMC",
"tvol": 76.97
}
While retrieving results, I am unable to map them to my custom object.
This is my client program
public class Data {
public static void main(String[] args) {
try {
String textUri = "mongodb://admin:password1@ds043388.mongolab.com:43388/stocks";
MongoURI uri = new MongoURI(textUri);
Mongo m = new Mongo(uri);
DB db = m.getDB("stocks");
DBCollection table = db.getCollection("stock");
BasicDBObject searchQuery = new BasicDBObject();
DBCursor cursor = table.find(searchQuery);
while (cursor.hasNext()) {
Security sec = (Security) cursor.next();
System.out.println(sec.getSymbol());
}
} catch (UnknownHostException e) {
e.printStackTrace();
} catch (MongoException e) {
e.printStackTrace();
}
}
}
Security.java
package com;
import com.mongodb.BasicDBObject;
public class Security extends BasicDBObject {
public Security() {
}
private String symbol;
public String getSymbol() {
return symbol;
}
public void setSymbol(String symbol) {
this.symbol = symbol;
}
}
This is the exception I am getting:
Exception in thread "main" java.lang.ClassCastException: com.mongodb.BasicDBObject cannot be cast to com.Security
at com.Data.main(Data.java:25)
You need to tell the driver what class to instantiate; try adding this line before you run your find():
table.setObjectClass(Security.class);
Your getters on your Security class aren't really complete here; you'd need to do something like this:
public String getSymbol() {
return getString("symbol");
}
When the MongoDB driver returns an object, it returns a BasicDBObject by default, which is a kind of Map.
If you want to retrieve it as your own Java POJO, you have to implement the DBObject interface.
Reference
You are extending BasicDBObject, which internally implements DBObject, so you are fine in that context.
However, you have to explicitly tell Mongo to return it as Security:
collection.setObjectClass(Security.class);
