Unit testing spring-data-mongodb custom converters - java

I am trying to unit test a spring-data-mongodb custom converter. I am following this document. According to the document, there should be a method called afterMappingMongoConverterCreation in the AbstractMongoConfiguration class, and we need to override that method to configure a custom converter. Interestingly, that method is not found in version 1.3.1 (the document is for the same version). The same document also talks about a method named setCustomConverters in MappingMongoConverter. I don't see that method either, in MappingMongoConverter or its superclass. Am I missing something here? Any help is much appreciated.
If the document is outdated, what is the best way to unit test custom converters? Any option other than XML configuration?

Looks like the document is a bit outdated. I got it fixed using the code below.
@EnableMongoRepositories
@ComponentScan(basePackageClasses = { ItemRepository.class })
@PropertySource("classpath:application.properties")
static class MongoConfiguration extends AbstractMongoConfiguration {

    @Override
    protected String getDatabaseName() {
        return "scrumretro-test";
    }

    @Override
    public Mongo mongo() {
        return new Fongo("mongo-test").getMongo();
    }

    @Override
    protected String getMappingBasePackage() {
        return "com.scrumretro.repository.mongo";
    }

    @Bean
    public CustomConversions customConversions() {
        List<Converter<?, ?>> converters = new ArrayList<Converter<?, ?>>();
        converters.add(new ItemWriteConverter());
        return new CustomConversions(converters);
    }
}
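Beyond registering it through CustomConversions, the converter itself can also be unit tested in complete isolation, with no Spring context at all. Below is a self-contained sketch of that idea using plain-Java stand-ins: the real Item and ItemWriteConverter live in the project above (and a real write converter would produce a DBObject rather than a Map), so the classes and field names here are assumptions for illustration only.

```java
import java.util.HashMap;
import java.util.Map;

public class ConverterSketch {

    // Stand-in for the project's Item entity (hypothetical field)
    static class Item {
        private final String name;
        Item(String name) { this.name = name; }
        String getName() { return name; }
    }

    // Stand-in for a Spring Converter<Item, DBObject>: a write converter
    // is just a pure function from entity to document, so it can be
    // asserted on directly without any container.
    static class ItemWriteConverter {
        Map<String, Object> convert(Item source) {
            Map<String, Object> doc = new HashMap<>();
            doc.put("name", source.getName());
            return doc;
        }
    }

    public static void main(String[] args) {
        Map<String, Object> doc = new ItemWriteConverter().convert(new Item("retro-board"));
        System.out.println(doc.get("name")); // prints "retro-board"
    }
}
```

Because the converter has no container dependencies, a plain JUnit test asserting on the produced document covers it; the Fongo-backed configuration above is then only needed for tests that go through the full mapping layer.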

Related

Testing Spring Hateoas Application with RepresentationModelAssembler

I'm trying to test my Spring Hateoas application, more specifically the controllers, using Spring's @WebMvcTest. But I'm having problems injecting my custom RepresentationModelAssembler into the test.
First a bit of my setup:
I'm using a custom RepresentationModelAssembler to turn my DB-Models into DTOs, which have all necessary links added.
The RepresentationModelAssembler:
@Component
public class BusinessUnitAssembler implements RepresentationModelAssembler<BusinessUnit, BusinessUnitDto> {

    private final Class<BusinessUnitController> controllerClass = BusinessUnitController.class;

    private final BusinessUnitMapper businessUnitMapper;

    public BusinessUnitAssembler(BusinessUnitMapper businessUnitMapper) {
        this.businessUnitMapper = businessUnitMapper;
    }

    @Override
    public BusinessUnitDto toModel(BusinessUnit entity) {
        return businessUnitMapper.businessUnitToDto(entity)
                .add(linkTo(methodOn(controllerClass).findById(entity.getId())).withSelfRel());
    }
}
The BusinessUnitMapper used here is a Mapstruct mapper, which is injected by Spring. In my service I use the BusinessUnitAssembler to turn my DB models into DTOs; an example service method:
public Page<BusinessUnitDto> findAll(Pageable pageable) {
    Page<BusinessUnit> pagedResult = businessUnitRepository.findAll(pageable);
    if (pagedResult.hasContent()) {
        return pagedResult.map(businessUnitAssembler::toModel);
    } else {
        return Page.empty();
    }
}
This is how I'm doing the testing currently:
@WebMvcTest(controllers = BusinessUnitController.class)
public class BusinessUnitControllerTest {

    @Autowired
    private MockMvc mockMvc;

    @MockBean
    private BusinessUnitService businessUnitService;

    private BusinessUnitMapper mapper = Mappers.getMapper(BusinessUnitMapper.class);

    private BusinessUnitAssembler assembler = new BusinessUnitAssembler(mapper);

    @Test
    public void getAllShouldReturnAllBusinessUnits() throws Exception {
        List<BusinessUnitDto> businessUnits = Stream.of(
                new BusinessUnit(1L, "Personal"),
                new BusinessUnit(2L, "IT")
        ).map(businessUnit -> assembler.toModel(businessUnit)).collect(Collectors.toList());
        when(businessUnitService.findAll(Pageable.ofSize(10))).thenReturn(new PageImpl<>(businessUnits));
        mockMvc.perform(get("/businessUnits").accept(MediaTypes.HAL_JSON))
                .andDo(print())
                .andExpect(status().isOk())
                .andExpect(jsonPath("$.*", hasSize(3)));
        // ... do more jsonPath checking
    }
}
But I'd like to have Spring inject the BusinessUnitAssembler instead of constructing it myself. I've tried @Import-ing BusinessUnitAssembler as well as the BusinessUnitMapper, and I've also tried using a custom @Configuration, but I just couldn't get it to work.
So my Question is: How can I let Spring inject my BusinessUnitAssembler into the test for me instead of assembling it myself?
Additional question: is it valid to combine the mapping from database entity to DTO in the RepresentationModelAssembler, or should those two steps be kept separate from each other?
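For the first question, one approach that is often suggested is to pull the real assembler (and the mapper it depends on) into the @WebMvcTest slice with @Import. The sketch below assumes Mapstruct is configured with componentModel = "spring", so that a generated BusinessUnitMapperImpl bean exists; that class name is an assumption about this project, not something confirmed by the question.

```java
// Sketch: import the assembler and the Mapstruct-generated mapper
// implementation into the test slice so Spring can wire them.
// BusinessUnitMapperImpl is the class Mapstruct generates for
// BusinessUnitMapper when componentModel = "spring" is set (assumed).
@WebMvcTest(controllers = BusinessUnitController.class)
@Import({BusinessUnitAssembler.class, BusinessUnitMapperImpl.class})
public class BusinessUnitControllerTest {

    @Autowired
    private MockMvc mockMvc;

    @Autowired
    private BusinessUnitAssembler assembler; // now Spring-managed

    @MockBean
    private BusinessUnitService businessUnitService;

    // ... tests as before, using the injected assembler
}
```

The @WebMvcTest slice only instantiates web-layer beans by default, which is why a plain @Component assembler is not picked up unless it is imported explicitly.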

spring-data-cassandra CassandraBatchTemplate is not public

I want to use CassandraBatchTemplate's withTimestamp to set a client-side timestamp, like the USING TIMESTAMP clause in CQL. Here is my code:
@Bean
public DseSession dseSession(DseCluster dseCluster) {
    return dseCluster.connect(keyspace);
}

@Bean
public CassandraOperations cassandraTemplate(DseSession session) {
    return new CassandraTemplate(session);
}

@Bean
public CassandraBatchOperations cassandraBatchTemplate(CassandraOperations cassandraTemplate) {
    return new CassandraBatchTemplate(cassandraTemplate);
}
When compiled, it complains that it cannot find CassandraBatchTemplate, even though I can see it in the spring-data-cassandra source code. One thing I noticed is that CassandraBatchTemplate is the default implementation of the interface CassandraBatchOperations, so no 'public' modifier is applied to the CassandraBatchTemplate class:
class CassandraBatchTemplate implements CassandraBatchOperations {...}
If the class is not public, then I cannot create an instance of it with 'new'. How can I work around this? I'm using spring-data-cassandra 2.1.10.RELEASE and dse-java-driver-core 1.8.2.
CassandraBatchTemplate isn't public because it has a very limited lifecycle. It isn't intended to be used as a @Bean because it is only valid for a single execution.
Instead, obtain CassandraBatchOperations through CassandraOperations.batchOps().
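A minimal usage sketch of that approach (not runnable on its own; it assumes a spring-data-cassandra CassandraTemplate is already wired, and "Person" is a hypothetical mapped entity standing in for your own model):

```java
// Obtain a fresh batch from the template for each logical batch;
// the returned CassandraBatchOperations is single-use.
CassandraBatchOperations batch = cassandraTemplate.batchOps();

batch.withTimestamp(System.currentTimeMillis() * 1000) // like USING TIMESTAMP (microseconds)
     .insert(new Person("jdoe", "John", "Doe"))
     .execute(); // after execute() this batch instance cannot be reused
```

Calling any mutation method on the batch after execute() throws an IllegalStateException, which is exactly the single-execution lifecycle that makes the class unsuitable as a shared @Bean.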

Spring Data Mongo - Inheritance and Embeddable

Let's say I have a Customer entity with a list of Vehicles:
@Document
public class Customer {

    private List<Vehicle> vehicles;

    // ... getters, setters
}
Vehicle is an abstract class with a few subtypes:
public abstract class Vehicle {
}

@TypeAlias("CAR")
public class Car extends Vehicle {
}

@TypeAlias("BOAT")
public class Boat extends Vehicle {
}

@TypeAlias("MOTORBIKE")
public class Motorbike extends Vehicle {
}
Is there any way to have Spring handle this use case? I.e., if I save a Car and a Boat against a Customer, have them correctly hydrate when querying the Customer? At the moment I'm getting a java.lang.InstantiationError, as Spring Data seems to be trying to create an instance of the abstract Vehicle class.
Managed to resolve the issue.
Basically, I needed to add the package containing the Vehicle classes to my Mongo configuration class so that it gets scanned, as follows:
public class CustomerDbConfig extends AbstractMongoConfiguration {
    ...

    @Override
    protected Collection<String> getMappingBasePackages() {
        Collection<String> mappingBasePackages = new ArrayList<>();
        mappingBasePackages.add(Vehicle.class.getPackageName());
        return mappingBasePackages;
    }
}
I think the above should work in most cases. My understanding is that it might not be necessary if the configuration class is in the same package as the Vehicle classes; in my case, however, they're in two different packages.
Additionally, it was a bit more complicated on my side since I have multiple Mongo databases with different configurations and different MongoTemplate beans.
Initially, I was creating the MongoTemplate as follows:
@Primary
@Bean(name = "customerMongoTemplate")
public MongoTemplate customerMongoTemplate() {
    MongoTemplate mongoTemplate = new MongoTemplate(mongoClient(), getDatabaseName());
    MappingMongoConverter converter = (MappingMongoConverter) mongoTemplate.getConverter();
    converter.setCustomConversions(customConversions());
    converter.afterPropertiesSet();
    return mongoTemplate;
}
However, getting the MappingMongoConverter this way, via the MongoTemplate, means that getMappingBasePackages was never called. I tried to get the converter instead by doing the following:
@Primary
@Bean(name = "customerMongoTemplate")
public MongoTemplate customerMongoTemplate() {
    return new MongoTemplate(mongoDbFactory(), mappingMongoConverter());
}
However, it didn't work, as mongoDbFactory() and mappingMongoConverter() returned beans belonging to a different MongoDB configuration. This would be the ideal solution for me, but I'm not sure how to get it working reliably with multiple configuration classes.
In the end, I managed to get it working reliably as follows:
@Primary
@Bean(name = "customerMongoTemplate")
public MongoTemplate customerMongoTemplate() throws Exception {
    SimpleMongoDbFactory mongoDbFactory = new SimpleMongoDbFactory(mongoClient(), getDatabaseName());
    MongoMappingContext mongoMappingContext = mongoMappingContext();
    mongoMappingContext.setInitialEntitySet(getInitialEntitySet());
    mongoMappingContext.afterPropertiesSet();
    MappingMongoConverter converter = new MappingMongoConverter(new DefaultDbRefResolver(mongoDbFactory), mongoMappingContext);
    converter.setCustomConversions(customConversions());
    converter.afterPropertiesSet();
    return new MongoTemplate(mongoDbFactory, converter);
}
I'm not entirely comfortable with the above approach; it's somewhat finicky and could potentially cause issues with new versions of Spring. However, it does work.

How to configure multiple Couchbase data sources using spring-data-couchbase?

I am trying to configure multiple Couchbase data sources using spring-data-couchbase.
This is the way I tried to attach two Couchbase sources to two repositories:
@Configuration
@EnableCouchbaseRepositories("com.xyz.abc")
public class AbcDatasource extends AbstractCouchbaseConfiguration {

    @Override
    protected List<String> getBootstrapHosts() {
        return Collections.singletonList("ip_address_of_couchbase");
    }

    // bucket_name
    @Override
    protected String getBucketName() {
        return "bucket_name";
    }

    // password
    @Override
    protected String getBucketPassword() {
        return "user_password";
    }

    @Override
    @Bean(destroyMethod = "disconnect", name = "COUCHBASE_CLUSTER_2")
    public Cluster couchbaseCluster() throws Exception {
        return CouchbaseCluster.create(couchbaseEnvironment(), "ip_address_of_couchbase");
    }

    @Bean(name = "BUCKET2")
    public Bucket bucket2() throws Exception {
        return this.couchbaseCluster().openBucket("bucket2", "somepassword");
    }

    @Bean(name = "BUCKET2_TEMPLATE")
    public CouchbaseTemplate newTemplateForBucket2() throws Exception {
        CouchbaseTemplate template = new CouchbaseTemplate(
                couchbaseClusterInfo(), // reuse the default bean
                bucket2(), // the bucket is non-default
                mappingCouchbaseConverter(), translationService()
        );
        template.setDefaultConsistency(getDefaultConsistency());
        return template;
    }

    @Override
    public void configureRepositoryOperationsMapping(RepositoryOperationsMapping baseMapping) {
        baseMapping
                .mapEntity(SomeDAOUsedInSomeRepository.class, newTemplateForBucket2());
    }
}
Similarly:
@Configuration
@EnableCouchbaseRepositories("com.xyz.mln")
public class MlnDatasource extends AbstractCouchbaseConfiguration {...}
Now the problem is that there is no straightforward way to specify a namespace-based data source by attaching different beans to these configurations, the way spring-data-jpa supports it via entity-manager-factory-ref and transaction-manager-ref.
As a result, only one configuration is picked up, whichever comes first.
Any suggestion is greatly appreciated.
Related question: Use Spring Data Couchbase to connect to different Couchbase clusters
@anshul you are almost there.
Make one of the data sources @Primary; it will be used as the default bucket.
Wherever you want to use the other bucket, just inject the specific bean into your service class with a qualifier. Below is an example:
@Qualifier(value = "BUCKET1_TEMPLATE")
@Autowired
CouchbaseTemplate couchbaseTemplate;
Now you can use this template to perform all Couchbase-related operations on the desired bucket.

How to load data in specified database by Spring Data Mongodb?

I have configured a Spring MongoDB project to load data into a database named "warehouse". Here is how my config class looks:
@Configuration
public class SpringMongoConfig extends AbstractMongoConfiguration {

    @Override
    protected String getDatabaseName() {
        return "warehouse";
    }

    public @Bean Mongo mongo() throws Exception {
        return new Mongo("localhost");
    }

    public @Bean MongoTemplate mongoTemplate() throws Exception {
        return new MongoTemplate(mongo(), getDatabaseName());
    }
}
But Spring is always using the default database "test" to store and retrieve the collections. I have tried different approaches to point it to the "warehouse" db, but it doesn't seem to work. What am I doing wrong? Any leads are appreciated.
Assuming you have a standard mongo install (e.g., the database is at a default such as /data/db or C:\data\db), your configuration class looks correct. How are you using it? Can you try:
SpringMongoConfig config = new SpringMongoConfig();
MongoTemplate template = config.mongoTemplate();
template.createCollection("someCollection");
From a shell, if you then log into mongo and enter show dbs, do you not see "warehouse"?
