Currently I have this class, which implements the Builder pattern. For the sake of readability I have chosen to omit some methods; more precisely, I only show the build methods for the username.
package dao.constraint;
import java.util.Arrays;
public class AccountConstraint {
private Constraint<Range<Integer>> accountIdConstraint;
private Constraint<String> usernameConstraint;
private Constraint<String> passwordConstraint;
private Constraint<String> emailConstraint;
private AccountConstraint(Builder builder) {
this.accountIdConstraint = builder.accountIdConstraint;
this.usernameConstraint = builder.usernameConstraint;
this.passwordConstraint = builder.passwordConstraint;
this.emailConstraint = builder.emailConstraint;
}
public Constraint<Range<Integer>> getAccountIdConstraint() {
return accountIdConstraint;
}
public Constraint<String> getUsernameConstraint() {
return usernameConstraint;
}
public Constraint<String> getPasswordConstraint() {
return passwordConstraint;
}
public Constraint<String> getEmailConstraint() {
return emailConstraint;
}
public Constraint[] getConstraints() {
return Arrays.asList(this.getAccountIdConstraint(), this.getUsernameConstraint(), this.getPasswordConstraint(), this.getEmailConstraint()).toArray(new Constraint[4]);
}
public static class Builder {
private Constraint<Range<Integer>> accountIdConstraint;
private Constraint<String> usernameConstraint;
private Constraint<String> passwordConstraint;
private Constraint<String> emailConstraint;
public Builder() {
this.accountIdConstraint = null;
this.usernameConstraint = null;
this.passwordConstraint = null;
this.emailConstraint = null;
}
public Builder username(final String username) {
this.usernameConstraint = new Constraint<>(Operation.IS, true, username, "username");
return this;
}
public Builder notUsername(final String username) {
this.usernameConstraint = new Constraint<>(Operation.IS, false, username, "username");
return this;
}
public Builder usernameLike(final String username) {
this.usernameConstraint = new Constraint<>(Operation.LIKE, true, username, "username");
return this;
}
public Builder usernameNotLike(final String username) {
this.usernameConstraint = new Constraint<>(Operation.LIKE, false, username, "username");
return this;
}
public AccountConstraint build() {
return new AccountConstraint(this);
}
}
}
As you can see, there is a very subtle difference between AccountConstraint.Builder.username(String s) and AccountConstraint.Builder.notUsername(String s).
I would like to be able to write something like new AccountConstraint.Builder().not(username(s));. However, as far as I know this is not valid Java syntax if username(String s) is not defined in the calling class. Nor do I wish to repeat the whole AccountConstraint.Builder() chain again just to reach the username(String s) part. Any solutions?
Second question: can AccountConstraint.getConstraints() be improved or written more simply?
Regards.
You could make not a method of your builder that sets a flag, which then negates the next constraint:
private boolean negate = false;
public Builder not() {
negate = true;
return this;
}
public Builder username(final String username) {
this.usernameConstraint = new Constraint<>(Operation.IS, !negate, username, "username");
negate = false;
return this;
}
For your second question:
public Constraint[] getConstraints() {
return Arrays.asList(this.getAccountIdConstraint(),
this.getUsernameConstraint(),
this.getPasswordConstraint(),
this.getEmailConstraint())
.toArray(new Constraint[4]);
}
can be re-written to :
public Constraint[] getConstraints() {
return new Constraint[] {
this.accountIdConstraint,
this.usernameConstraint,
this.passwordConstraint,
this.emailConstraint
};
}
But IMO, returning a List or Set would be better than an array.
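For instance, a List-returning version could look like the following sketch; the Constraint class here is a minimal stub just to make the example self-contained:

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

// Minimal stub standing in for the question's Constraint type.
class Constraint<T> {
    final T value;
    Constraint(T value) { this.value = value; }
}

class AccountConstraint {
    private final Constraint<String> usernameConstraint = new Constraint<>("bob");
    private final Constraint<String> emailConstraint = new Constraint<>("bob@example.com");

    // An unmodifiable List keeps the element type (no raw Constraint[] array),
    // and callers cannot accidentally mutate the returned collection.
    public List<Constraint<String>> getConstraints() {
        return Collections.unmodifiableList(
                Arrays.asList(usernameConstraint, emailConstraint));
    }
}
```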
What I find extremely elegant in situations like this is to write a utility class with static factory methods like
public static Constraint username(...) { ... }
and to import static blabla.Utility.username;
Then you can write almost declarative, human-readable queries in Java. This is very much like the Hamcrest library for unit testing, where you write something like
Assert.assertThat(blabla, is(equalTo(nullValue())));
In this case Not should implement Constraint and simply negate the nested (referenced) constraint, like this:
public static Constraint not(Constraint negated) { return new Not(negated); }
this results in code like
PreparedStatement ps = new QueryBuilder()
.select()
.from(table("accounts"))
.where(not(username(equalTo("blabla"))))
.compile();
You can add static factories for boolean combinations:
.where(and(
not(...),
not(or(...))))
Defining constraints like this (static factory methods as opposed to adding them to the builder) thus makes them easily composable.
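A minimal, self-contained version of that composability idea looks like this; the Constraint interface and the factory names are illustrative, not the question's classes:

```java
// Illustrative composable constraints via static factories; with static
// imports, call sites read almost like English.
interface Constraint {
    boolean test(String value);
}

final class Constraints {
    private Constraints() {}

    public static Constraint equalTo(final String expected) {
        return value -> value.equals(expected);
    }

    // not(...) wraps and negates the referenced constraint.
    public static Constraint not(final Constraint negated) {
        return value -> !negated.test(value);
    }

    // and(...) succeeds only if every part succeeds.
    public static Constraint and(final Constraint... parts) {
        return value -> {
            for (Constraint part : parts) {
                if (!part.test(value)) return false;
            }
            return true;
        };
    }
}
```

With import static Constraints.*, a condition such as not(equalTo("blabla")) composes freely with and/or without the builder having to know about negation at all.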
I've been struggling for a while trying to find a solution to this problem. Hope you can help me out.
I'm trying to generate a method that calls a static method from another class using some already defined fields:
class Test {
private String someField;
private String otherField;
}
Expected result:
class Test {
private String someField;
private String otherField;
public String getCacheKey() {
return SimpleCacheKey.of(this.someField, this.otherField);
}
}
class SimpleCacheKey {
public static String of(final Object... values) {
// Some Operations
return computed_string;
}
}
I've tried several things, closest one:
public class ModelProcessor implements Plugin {
@Override
public Builder<?> apply(final Builder<?> builder,
final TypeDescription typeDescription,
final ClassFileLocator classFileLocator) {
return builder.defineMethod("getCacheKey", String.class, Visibility.PUBLIC)
.intercept(new SimpleCacheKeyImplementation());
}
@Override
public void close() throws IOException {
}
@Override
public boolean matches(final TypeDescription typeDefinitions) {
return true;
}
}
public class SimpleCacheKeyImplementation implements Implementation {
private static final MethodDescription SIMPLE_CACHE_KEY_OF = getOf();
@SneakyThrows
private static MethodDescription.ForLoadedMethod getOf() {
return new MethodDescription.ForLoadedMethod(SimpleCacheKey.class.getDeclaredMethod("of", Object[].class));
}
@Override
public InstrumentedType prepare(final InstrumentedType instrumentedType) {
return instrumentedType;
}
@Override
public ByteCodeAppender appender(final Target implementationTarget) {
final TypeDescription thisType = implementationTarget.getInstrumentedType();
return new ByteCodeAppender.Simple(Arrays.asList(
// first param
MethodVariableAccess.loadThis(),
this.getField(thisType, "someField"),
// second param
MethodVariableAccess.loadThis(),
this.getField(thisType, "otherField"),
// call of and return the result
MethodInvocation.invoke(SIMPLE_CACHE_KEY_OF),
MethodReturn.of(TypeDescription.STRING)
));
}
private StackManipulation getField(final TypeDescription thisType, final String name) {
return FieldAccess.forField(thisType.getDeclaredFields()
.filter(ElementMatchers.named(name))
.getOnly()
).read();
}
}
However, generated code is as follows (decompiled with Intellij Idea):
public String getCacheKey() {
String var10000 = this.someField;
return SimpleCacheKey.of(this.otherField);
}
Changing the signature of SimpleCacheKey.of and trying to workaround the problem with a List is not an option.
You are calling a vararg method, and Java bytecode doesn't have varargs: the compiler builds the array at the call site. So you need to create an actual array of the correct type in order to call the method.
@Override
public ByteCodeAppender appender(final Target implementationTarget) {
final TypeDescription thisType = implementationTarget.getInstrumentedType();
return new ByteCodeAppender.Simple(Arrays.asList(
ArrayFactory.forType(TypeDescription.Generic.OBJECT)
.withValues(Arrays.asList(
new StackManipulation.Compound(MethodVariableAccess.loadThis(),
this.getField(thisType, "someField")),
new StackManipulation.Compound(MethodVariableAccess.loadThis(),
this.getField(thisType, "otherField")))),
MethodInvocation.invoke(SIMPLE_CACHE_KEY_OF),
MethodReturn.of(TypeDescription.STRING)));
}
Maybe Byte Buddy has a special builder for that, but at least that's one way of doing it.
IMO, it is often a good approach to write a Java version of the bytecode you want to generate. That way you can compare the javac bytecode with the Byte Buddy bytecode.
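For instance, compiling a plain-Java version of the target method and running javap -c Test makes the vararg array construction visible (anewarray, then dup/aastore per element, then invokestatic). SimpleCacheKey below is a stand-in with the question's signature; its body is made up for the demo:

```java
// Stand-in matching the question's SimpleCacheKey.of(Object...) signature;
// the joining logic is invented just so the demo runs.
class SimpleCacheKey {
    public static String of(final Object... values) {
        StringBuilder sb = new StringBuilder();
        for (Object value : values) {
            if (sb.length() > 0) sb.append(':');
            sb.append(value);
        }
        return sb.toString();
    }
}

// Plain-Java version of the method we want Byte Buddy to generate.
// Compile and inspect with `javap -c Test` to see the expected bytecode.
class Test {
    private String someField = "a";
    private String otherField = "b";

    public String getCacheKey() {
        return SimpleCacheKey.of(this.someField, this.otherField);
    }
}
```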
I am trying to generate a very simple code with Byte Buddy.
I have a POJO class where some fields are annotated with @SecureAttribute. For such fields I would like to override the getter implementation and redirect the call to SecurityService.getSecureValue().
Original class:
public class Properties {
@SecureAttribute
protected String password;
public String getPassword() {
return password;
}
public void setPassword(String password) {
this.password = password;
}
}
Desired Proxy:
public class PropertiesProxy {
private SecurityService securityService;
public void setSecurityService(SecurityService var1) {
this.securityService = var1;
}
public SecurityService getSecurityService() {
return this.securityService;
}
@Override
public String getPassword() {
return securityService.getSecureValue(password);
}
}
Emitting a field was easy, but overriding a method turns out to be more complicated. I have found a number of samples related to my task and tried to apply them, but I do not seem to get the required result.
So my major question is: how do I trace and debug the code generator? First thing I've learned was to print the class to file:
DynamicType.Unloaded<?> unloadedType = byteBuddy.make();
unloadedType.saveIn(new File("d:/temp/bytebuddy"));
This gives me an output where the extra field was added, but there is no trace of the getter override (disassembled from the .class file):
public class PropertiesImpl$ByteBuddy$OLlyZYNY extends PropertiesImpl {
private SecurityService securityService;
public void setSecurityService(SecurityService var1) {
this.securityService = var1;
}
public SecurityService getSecurityService() {
return this.securityService;
}
public PropertiesImpl$ByteBuddy$OLlyZYNY() {
}
}
Here I do not exactly understand where to look for the error. Does it mean that I used a totally wrong method implementation and Byte Buddy simply skipped it? Or am I wrong with the ElementMatchers? Is there some trace or similar facility that will give me a clue how to fix my code?
Current implementation:
private Class<?> wrapProperties() throws IOException {
DynamicType.Builder<?> byteBuddy = new ByteBuddy()
.subclass(PropertiesImpl.class)
.defineProperty("securityService", SecurityService.class);
Arrays.stream(PropertiesImpl.class.getDeclaredFields())
.filter(item -> item.getAnnotation(SecureAttribute.class) != null)
.forEach(item -> byteBuddy
.method(ElementMatchers.named(getGetterBeanName(item)))
.intercept(new GetterWrapperImplementation(item)));
DynamicType.Unloaded<?> unloadedType = byteBuddy.make();
unloadedType.saveIn(new File("d:/temp/bytebuddy"));
Class<?> wrapperClass = unloadedType.load(PropertiesImpl.class.getClassLoader(), ClassLoadingStrategy.Default.WRAPPER)
.getLoaded();
return wrapperClass;
}
public static class GetterWrapperImplementation implements Implementation {
public static final TypeDescription SS_TYPE;
public static final MethodDescription SS_GET_SECURE_VALUE;
private final Field field;
static {
try {
SS_TYPE = new TypeDescription.ForLoadedType(SecurityService.class);
SS_GET_SECURE_VALUE = new MethodDescription.ForLoadedMethod(SecurityService.class.getDeclaredMethod("getSecureValue", String.class));
}
catch (final NoSuchMethodException | SecurityException e) {
throw new RuntimeException(e);
}
}
public GetterWrapperImplementation(Field field) {
this.field = field;
}
@Override
public InstrumentedType prepare(final InstrumentedType instrumentedType) {
return instrumentedType;
}
@Override
public ByteCodeAppender appender(final Target implementationTarget) {
final TypeDescription thisType = implementationTarget.getInstrumentedType();
return new ByteCodeAppender.Simple(Arrays.asList(
TypeCreation.of(SS_TYPE),
// get securityService field
MethodVariableAccess.loadThis(),
FieldAccess.forField(thisType.getDeclaredFields()
.filter(ElementMatchers.named("securityService"))
.getOnly()
).read(),
// get secure field
MethodVariableAccess.loadThis(),
FieldAccess.forField(thisType.getDeclaredFields()
.filter(ElementMatchers.named(field.getName()))
.getOnly()
).read(),
MethodInvocation.invoke(SS_GET_SECURE_VALUE),
MethodReturn.of(TypeDescription.STRING)
));
}
}
What I know for a fact is that breakpoints inside ByteCodeAppender appender(final Target implementationTarget) do not get hit, but again I am not sure how to interpret this.
Thanks.
The Byte Buddy DSL is immutable. This means that you always have to call:
builder = builder.method(...).intercept(...);
Your forEach does not do what you expect for this reason.
As for your implementation, you can just use MethodCall on a field and define the other field as an argument.
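To see why the forEach was a no-op, here is a tiny immutable-builder stand-in (not the real Byte Buddy API) that behaves the same way:

```java
// A minimal immutable fluent builder: each call returns a NEW instance and
// leaves the receiver unchanged, just like the Byte Buddy DSL.
final class Fluent {
    final String spec;
    Fluent(String spec) { this.spec = spec; }
    Fluent define(String name) { return new Fluent(spec + "+" + name); }
}

class Demo {
    static String broken() {
        Fluent builder = new Fluent("base");
        for (String getter : new String[] {"getPassword", "getToken"}) {
            builder.define(getter);       // result discarded -- nothing happens
        }
        return builder.spec;              // still just "base"
    }

    static String fixed() {
        Fluent builder = new Fluent("base");
        for (String getter : new String[] {"getPassword", "getToken"}) {
            builder = builder.define(getter);  // reassign to accumulate changes
        }
        return builder.spec;
    }
}
```

The fix in the question's wrapProperties is the same reassignment: collect the stream into a loop (or use reduce) and write byteBuddy = byteBuddy.method(...).intercept(...) each iteration.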
I've searched the internet for something to similar to what I'm doing, but haven't found anything. I'm using interfaces in Java 8 to create a Builder pattern, like so:
public class UrlImmutable {
public final String parentUrl;
public final Double parentUrlSentiment;
public final Set<String> childUrls;
public final boolean isParentVendorUrl;
public final Map<TagClassification, Set<String>> parentUrlArticleTags;
private UrlImmutable(String parentUrl, Double parentUrlSentiment, Set<String> childUrls, boolean isParentVendorUrl,
Map<TagClassification, Set<String>> parentUrlArticleTags ) {
super();
this.parentUrl = parentUrl;
this.parentUrlSentiment = parentUrlSentiment;
this.childUrls = childUrls;
this.isParentVendorUrl = isParentVendorUrl;
this.parentUrlArticleTags = parentUrlArticleTags;
}
/** Our Interfaces for the Builder **/
public interface ParentUrlBuilder {
ParentUrlSentimentBuilder parentUrl(String parentUrl);
}
public interface ParentUrlSentimentBuilder {
ChildUrlBuilder parentUrlSentiment(Double parentUrlSentiment);
}
public interface ChildUrlBuilder {
IsVendorUrlBuilder childUrls(Set<String> childUrls);
}
public interface IsVendorUrlBuilder {
ParentUrlArticleTagsBuilder isParentVendorUrl(boolean isParentVendorUrl);
}
public interface ParentUrlArticleTagsBuilder {
UrlImmutable parentUrlArticleTags(Map<TagClassification,Set<String>> parentUrlArticleTags);
}
public static ParentUrlBuilder discoveredUrl() {
return parentUrl -> parentUrlSentiment -> childUrls -> isParentVendorUrl -> parentUrlArticleTags ->
new UrlImmutable(parentUrl, parentUrlSentiment, childUrls, isParentVendorUrl, parentUrlArticleTags);
}
}
And to construct this object, we do this:
UrlImmutable url =
UrlImmutable.discoveredUrl()
.parentUrl("http://www.google.com")
.parentUrlSentiment(10.5)
.childUrls(childUrls)
.isParentVendorUrl(true)
.parentUrlArticleTags(parentUrlArticleTags);
I can't seem to find the right combination of annotations for this. Any help much appreciated!
I have two classes as shown below. I need to use these two classes to extract few things.
public final class ProcessMetadata {
private final String clientId;
private final String deviceId;
// .. lot of other fields here
// getters here
}
public final class ProcMetadata {
private final String deviceId;
private final Schema schema;
// .. lot of other fields here
}
Now I have below code where I am iterating above two classes and extracting schema given a clientId.
public Optional<Schema> getSchema(final String clientId) {
for (ProcessMetadata metadata1 : processMetadataList) {
if (metadata1.getClientId().equalsIgnoreCase(clientId)) {
String deviceId = metadata1.getDeviceId();
for (ProcMetadata metadata2 : procMetadataList) {
if (metadata2.getDeviceId().equalsIgnoreCase(deviceId)) {
return Optional.of(metadata2.getSchema());
}
}
}
}
return Optional.absent();
}
Is there a better way of getting what I need than iterating over those two lists the way I do, ideally in a couple of lines? I am using Java 7.
You're doing a quadratic* search operation, which is inefficient. You can do this lookup in constant time by first building (in linear time) a map from ID to object for each list. That would look something like this:
// do this once, in the constructor or wherever you create these lists
// even better discard the lists and use the mappings everywhere
Map<String, ProcessMetadata> processMetadataByClientId = new HashMap<>();
for (ProcessMetadata process : processMetadataList) {
processMetadataByClientId.put(process.getClientId(), process);
}
Map<String, ProcMetadata> procMetadataByDeviceId = new HashMap<>();
for (ProcMetadata proc : procMetadataList) {
procMetadataByDeviceId.put(proc.getDeviceId(), proc);
}
Then your lookup simply becomes:
public Optional<Schema> getSchema(String clientId) {
ProcessMetadata process = processMetadataByClientId.get(clientId);
if (process != null) {
ProcMetadata proc = procMetadataByDeviceId.get(process.getDeviceId());
if (proc != null) {
return Optional.of(proc.getSchema());
}
}
return Optional.absent();
}
In Java 8 you could write it like this:
public Optional<Schema> getSchema(String clientId) {
return Optional.ofNullable(processMetadataByClientId.get(clientId))
.map(p -> procMetadataByDeviceId.get(p.getDeviceId()))
.map(p -> p.getSchema());
}
* In practice your algorithm is linear assuming client IDs are unique, but it's still technically O(n^2) because you potentially touch every element of the proc list for every element of the process list. A slight tweak to your algorithm can guarantee linear time (again assuming unique IDs):
public Optional<Schema> getSchema(final String clientId) {
for (ProcessMetadata metadata1 : processMetadataList) {
if (metadata1.getClientId().equalsIgnoreCase(clientId)) {
String deviceId = metadata1.getDeviceId();
for (ProcMetadata metadata2 : procMetadataList) {
if (metadata2.getDeviceId().equalsIgnoreCase(deviceId)) {
return Optional.of(metadata2.getSchema());
}
}
// adding a break here ensures the search doesn't become quadratic
break;
}
}
return Optional.absent();
}
Though of course using maps ensures constant-time, which is far better.
I wondered what could be done with Guava, and accidentally wrote this hot mess.
import static com.google.common.collect.Iterables.tryFind;
public Optional<Schema> getSchema(final String clientId) {
Optional<String> deviceId = findDeviceIdByClientId(clientId);
return deviceId.isPresent() ? findSchemaByDeviceId(deviceId.get()) : Optional.absent();
}
public Optional<String> findDeviceIdByClientId(String clientId) {
return tryFind(processMetadataList, new ClientIdPredicate(clientId))
.transform(new Function<ProcessMetadata, String>() {
public String apply(ProcessMetadata processMetadata) {
return processMetadata.getDeviceId();
}
});
}
public Optional<Schema> findSchemaByDeviceId(String deviceId) {
return tryFind(procMetadataList, new DeviceIdPredicate(deviceId))
.transform(new Function<ProcMetadata, Schema>() {
public Schema apply(ProcMetadata procMetadata) {
return procMetadata.getSchema();
}
});
}
class DeviceIdPredicate implements Predicate<ProcMetadata> {
private String deviceId;
public DeviceIdPredicate(String deviceId) {
this.deviceId = deviceId;
}
@Override
public boolean apply(ProcMetadata metadata2) {
return metadata2.getDeviceId().equalsIgnoreCase(deviceId);
}
}
class ClientIdPredicate implements Predicate<ProcessMetadata> {
private String clientId;
public ClientIdPredicate(String clientId) {
this.clientId = clientId;
}
@Override
public boolean apply(ProcessMetadata metadata1) {
return metadata1.getClientId().equalsIgnoreCase(clientId);
}
}
Sorry.
I've created a class User that extends Document. User just has some simple constructors and getters/setters around some strings and ints. However, when I try to insert the User class into Mongo I get the following error:
Exception in thread "main" org.bson.codecs.configuration.CodecConfigurationException: Can't find a codec for class com.foo.User.
at org.bson.codecs.configuration.CodecCache.getOrThrow(CodecCache.java:46)
at org.bson.codecs.configuration.ProvidersCodecRegistry.get(ProvidersCodecRegistry.java:63)
at org.bson.codecs.configuration.ProvidersCodecRegistry.get(ProvidersCodecRegistry.java:37)
at org.bson.BsonDocumentWrapper.asBsonDocument(BsonDocumentWrapper.java:62)
at com.mongodb.MongoCollectionImpl.documentToBsonDocument(MongoCollectionImpl.java:507)
at com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:292)
at com.mongodb.MongoCollectionImpl.insertMany(MongoCollectionImpl.java:282)
at com.foo.bar.main(bar.java:27)
Sounds like I need to work with some MongoDB codec machinery, but I'm not familiar with it, and some quick googling returns results that seem pretty advanced.
How do I properly write my User class for use in Mongo? Here is my class for reference:
public class User extends Document {
User(String user, List<Document> history, boolean isActive, String location){
this.append("_id", user)
.append("history", history)
.append("isActive", isActive)
.append("location", location);
}
public List<Document> getHistory(){
return this.get("history", ArrayList.class);
}
public void addToHistory(Document event){
List<Document> history = this.getHistory();
history.add(event);
this.append("history", history);
}
public boolean hasMet(User otherUser){
List<String> usersIveMet = this.getUsersMet(),
usersTheyMet = otherUser.getUsersMet();
return !Collections.disjoint(usersIveMet, usersTheyMet);
}
public List<String> getUsersMet() {
List<Document> usersHistory = this.getHistory();
List<String> usersMet = usersHistory.stream()
.map(doc -> Arrays.asList(doc.getString("user1"), doc.getString("user2")))
.filter(u -> !u.equals(this.getUser()))
.flatMap(u -> u.stream())
.collect(Collectors.toList());
return usersMet;
}
public String getUser(){
return this.getString("_id");
}
}
Since you are trying to persist a new type (even if it extends Document), Mongo has no way to recognize it, and therefore you need to provide encoding/decoding logic to make Mongo aware of your object (at least I cannot see any other way).
I played with your User class a bit and got it to work.
So, here is how I defined the User class:
public class User {
private List<Document> history;
private String id;
private Boolean isActive;
private String location;
// Getters and setters. Omitted for brevity..
}
Then you need to provide encoding/decoding logic for your User class:
public class UserCodec implements Codec<User> {
private CodecRegistry codecRegistry;
public UserCodec(CodecRegistry codecRegistry) {
this.codecRegistry = codecRegistry;
}
@Override
public User decode(BsonReader reader, DecoderContext decoderContext) {
reader.readStartDocument();
String id = reader.readString("id");
Boolean isActive = reader.readBoolean("isActive");
String location = reader.readString("location");
Codec<Document> historyCodec = codecRegistry.get(Document.class);
List<Document> history = new ArrayList<>();
reader.readStartArray();
while (reader.readBsonType() != BsonType.END_OF_DOCUMENT) {
history.add(historyCodec.decode(reader, decoderContext));
}
reader.readEndArray();
reader.readEndDocument();
User user = new User();
user.setId(id);
user.setIsActive(isActive);
user.setLocation(location);
user.setHistory(history);
return user;
}
@Override
public void encode(BsonWriter writer, User user, EncoderContext encoderContext) {
writer.writeStartDocument();
writer.writeName("id");
writer.writeString(user.getId());
writer.writeName("isActive");
writer.writeBoolean(user.getIsActive());
writer.writeName("location");
writer.writeString(user.getLocation());
writer.writeStartArray("history");
for (Document document : user.getHistory()) {
Codec<Document> documentCodec = codecRegistry.get(Document.class);
encoderContext.encodeWithChildContext(documentCodec, writer, document);
}
writer.writeEndArray();
writer.writeEndDocument();
}
@Override
public Class<User> getEncoderClass() {
return User.class;
}
}
Then you need a codec provider, which performs the type check before serialization/deserialization starts:
public class UserCodecProvider implements CodecProvider {
@Override
@SuppressWarnings("unchecked")
public <T> Codec<T> get(Class<T> clazz, CodecRegistry registry) {
if (clazz == User.class) {
return (Codec<T>) new UserCodec(registry);
}
return null;
}
}
And finally, you need to register your provider with your MongoClient, and that's all:
public class MongoDb {
private MongoDatabase db;
public MongoDb() {
CodecRegistry codecRegistry = CodecRegistries.fromRegistries(
CodecRegistries.fromProviders(new UserCodecProvider()),
MongoClient.getDefaultCodecRegistry());
MongoClientOptions options = MongoClientOptions.builder()
.codecRegistry(codecRegistry).build();
MongoClient mongoClient = new MongoClient(new ServerAddress(), options);
db = mongoClient.getDatabase("test");
}
public void addUser(User user) {
MongoCollection<User> collection = db.getCollection("user").withDocumentClass(User.class);
collection.insertOne(user);
}
public static void main(String[] args) {
MongoDb mongoDb = new MongoDb();
Document history1 = new Document();
history1.append("field1", "value1");
history1.append("field2", "value2");
history1.append("field3", "value3");
List<Document> history = new ArrayList<>();
history.add(history1);
User user = new User();
user.setId("someId1");
user.setIsActive(true);
user.setLocation("someLocation");
user.setHistory(history);
mongoDb.addUser(user);
}
}
A bit late, but I just stumbled across this issue and was somewhat disappointed by the work involved in the solutions proposed so far, especially since they require tons of custom code for every single Document-extending class you wish to persist, and might also exhibit sub-optimal performance that becomes noticeable in large data sets.
Instead, I figured one might piggyback on DocumentCodec like so (Mongo 3.x):
public class MyDocumentCodec<T extends Document> implements CollectibleCodec<T> {
private DocumentCodec _documentCodec;
private Class<T> _class;
private Constructor<T> _constructor;
public MyDocumentCodec(Class<T> class_) {
try {
_documentCodec = new DocumentCodec();
_class = class_;
_constructor = class_.getConstructor(Document.class);
} catch (Exception ex) {
throw new MCException(ex);
}
}
@Override
public void encode(BsonWriter writer, T value, EncoderContext encoderContext) {
_documentCodec.encode(writer, value, encoderContext);
}
@Override
public Class<T> getEncoderClass() {
return _class;
}
@Override
public T decode(BsonReader reader, DecoderContext decoderContext) {
try {
Document document = _documentCodec.decode(reader, decoderContext);
T result = _constructor.newInstance(document);
return result;
} catch (Exception ex) {
throw new MCException(ex);
}
}
@Override
public T generateIdIfAbsentFromDocument(T document) {
if (!documentHasId(document)) {
Document doc = _documentCodec.generateIdIfAbsentFromDocument(document);
document.put("_id", doc.get("_id"));
}
return document;
}
@Override
public boolean documentHasId(T document) {
return _documentCodec.documentHasId(document);
}
@Override
public BsonValue getDocumentId(T document) {
return _documentCodec.getDocumentId(document);
}
}
This is then registered along the lines of
MyDocumentCodec<MyClass> myCodec = new MyDocumentCodec<>(MyClass.class);
CodecRegistry codecRegistry = CodecRegistries.fromRegistries(MongoClient.getDefaultCodecRegistry(),
CodecRegistries.fromCodecs(myCodec));
MongoClientOptions options = MongoClientOptions.builder().codecRegistry(codecRegistry).build();
MongoClient dbClient = new MongoClient(new ServerAddress(_dbServer, _dbPort), options);
Switching to this approach, along with batching up some operations (which probably has the larger effect), I just managed to bring an operation that previously took several hours down to 30 minutes. The decode method can probably be improved, but my main concern was inserts for now.
Hope this helps someone. Please let me know if you see issues with this approach.
Thanks.
Have you tried using the @Embedded and @JsonIgnoreProperties(ignoreUnknown = true) annotations on top of your class signature?
This worked for me when I had a similar issue. I had a model (Translation) which I was storing in a HashMap member field of another model (Promo).
Once I added these annotations to the Translation class signature, the issue went away. Not sure if it'll work that way in your case but worth trying.
I have to explore more on this myself.