Hi SO, I'm a newbie in Vaadin, trying to create a form and bind it to a POJO.
// ... some declarations
Binder<User> binder = new Binder<>(User.class);

@Autowired
public FormUser(UserRepository userRepository, AuthorityRepository authorityRepository) {
    this.userRepository = userRepository;
    this.authorityRepository = authorityRepository;
    authorities = new ListSelect<>("Authorities", authorityRepository.findAll());
    authorities.setItemCaptionGenerator(Authority::getAuthority);
    // Set items
    username.setIcon(FontAwesome.USER);
    password.setIcon(FontAwesome.USER_SECRET);
    saveButton.addClickListener(e -> {
        userRepository.save(user);
    });
    setSpacing(true);
    addComponents(username, password, authorities, saveButton);
    binder.bindInstanceFields(this);
}
When I try to access the view that contains FormUser, I get this error:
java.lang.IllegalStateException: Property type 'java.util.Collection' doesn't match the field type 'java.util.Set<dev.gva.model.Authority>'. Binding should be configured manually using converter.
Authority:

public class Authority {
    private Long id;
    private String authority;
    // getters/setters...
}

User:

public class User {
    private Long id;
    private Collection<Authority> authorities;
    // other fields, getters/setters...
}
How do I write this converter? Thanks.
Instead of adding boilerplate conversion code, you should use a Set for the authorities attribute in your User class, so the field type matches what the multi-select component binds to. An added advantage is that a Set does not allow duplicate authorities. The difference between Set and List is that a List is ordered and can be accessed by index; you decide what you need, but a Set should be enough here.
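For example, a minimal sketch of the suggested change to User (only the authorities field shown; getters and setters use the same type):

import java.util.HashSet;
import java.util.Set;

public class User {
    private Long id;

    // Set instead of Collection: matches the Set-based value type that
    // Vaadin's multi-select fields bind to, and excludes duplicates.
    private Set<Authority> authorities = new HashSet<>();

    public Set<Authority> getAuthorities() {
        return authorities;
    }

    public void setAuthorities(Set<Authority> authorities) {
        this.authorities = authorities;
    }
}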
Related
I am using Spring repositories with Redis and want to store data about a user for 10 seconds and then have it expire (and be deleted) from Redis. I know expiring and deleting are different, but is there an easy way to have entries deleted automatically, the same way they expire?
I have the following entity
@RedisHash(value = "User", timeToLive = 10)
public class User {
    @Id
    private String id;

    @Indexed
    @ApiModelProperty(notes = "First name of the user")
    private String firstName;

    @ApiModelProperty(notes = "Last name of the user")
    private String lastName;

    ...
}
Repository
@Repository
public interface UserRepository extends CrudRepository<User, String> {
}
Configuration for Redis
@Configuration
public class RedisConfig {
    @Value("${redis.hostname}")
    private String redisHostname;

    @Value("${redis.port}")
    private int redisPort;

    @Bean
    public JedisConnectionFactory jedisConnectionFactory() {
        RedisStandaloneConfiguration redisStandaloneConfiguration = new RedisStandaloneConfiguration(redisHostname, redisPort);
        return new JedisConnectionFactory(redisStandaloneConfiguration);
    }

    @Bean
    public RedisTemplate<String, Object> redisTemplate() {
        RedisTemplate<String, Object> template = new RedisTemplate<>();
        template.setConnectionFactory(jedisConnectionFactory());
        return template;
    }
}
When I fetch all the entities with the repository's findAll method, the expired ones come back as a bunch of null values, and with a Redis client I can see that their keys are still in Redis. I am worried that this will fill the database with a lot of expired data. Is there a way to delete the expired entities?
The short answer is: put the following annotation above your main class (the one annotated with @SpringBootApplication):

@EnableRedisRepositories(enableKeyspaceEvents = EnableKeyspaceEvents.ON_STARTUP)
Objects will now get removed from the Redis store :)
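A minimal sketch of where the annotation goes (the class name here is an assumption for illustration):

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.data.redis.core.RedisKeyValueAdapter.EnableKeyspaceEvents;
import org.springframework.data.redis.repository.configuration.EnableRedisRepositories;

// With keyspace events enabled, Spring Data Redis listens for expiration
// events and cleans up the index entries of expired hashes.
@EnableRedisRepositories(enableKeyspaceEvents = EnableKeyspaceEvents.ON_STARTUP)
@SpringBootApplication
public class DemoApplication {
    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }
}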
When the expiration is set to a positive value, the corresponding EXPIRE command is executed. In addition to persisting the original, a phantom copy is persisted in Redis and set to expire five minutes after the original one.
For more information, please see the reference here. Hope this post helps you.
I have a configuration class like the one below. All fields in the inner class OptionalServiceConfigs have a default value, annotated with @Value as shown below. Sometimes my application.properties file does not contain a single service-prefixed property. In that case, I want an OptionalServiceConfigs instance loaded with its default field values.
@Configuration
@ConfigurationProperties(prefix = "myconf")
public class MyConfigs {
    // ... rest of my configs

    @Value("${service:?????}") // what to put here, or can I?
    private OptionalServiceConfigs service; // this is null

    // In this class all fields have a default value.
    public static class OptionalServiceConfigs {
        @Value("${mode:local}")
        private String mode;

        @Value("${timeout:30000}")
        private long timeout;

        // ... rest of getters and setters
    }

    // ... rest of getters and setters
}
But unfortunately, the service field is null when accessed via its getter, because Spring Boot does not initialize an instance of it when no property keys prefixed with myconf.service.* are found in my application.properties file.
Question:
How can I make the service field initialize to a new instance, with its specified default field values, when there are no corresponding prefixed keys in the properties file?
I can't imagine a value to put in the @Value("${service:?????}") annotation for the service field. Nothing I tried works, e.g. @Value("${service:}") or @Value("${service:new").
Based on @M. Deinum's advice, I made some changes to the configuration class. I am a newbie to Spring, and it seems I had misunderstood how Spring works behind the scenes.
First, I removed all @Value annotations from the inner class (i.e. OptionalServiceConfigs), as well as from the service field in the MyConfigs class.
Then I initialized all inner-class fields with their default values inline. In the constructor of MyConfigs, I initialized a new instance of OptionalServiceConfigs for the service field. By doing this, whenever there are no service-related keys in my application.properties, a new instance has already been created with default values. When there are service-related keys, Spring overrides my defaults with the specified values, but only for the fields I've actually specified.
I believe that, from Spring's perspective, there is no way to know in advance that a referenced field (i.e. the service field) is configuration-related when none of its keys exist in the configuration file. That must be why Spring does not initialize it. Fair enough.
Complete solution:
@Configuration
@ConfigurationProperties(prefix = "myconf")
public class MyConfigs {
    // ... rest of my configs

    private OptionalServiceConfigs service;

    public static class OptionalServiceConfigs {
        private String mode = "local";
        private long timeout = 30000L;

        // ... rest of getters and setters
    }

    public MyConfigs() {
        service = new OptionalServiceConfigs();
    }

    // ... rest of getters and setters
}
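To illustrate the overriding behavior (a hypothetical example): with only the following line in application.properties, Spring overrides mode while timeout keeps its inline default of 30000:

myconf.service.mode=remote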
You can try a structure like the following, which works quite well for me:
@Data
@Validated
@ConfigurationProperties(prefix = "gateway.auth")
@Configuration
public class AuthProperties {
    @NotNull
    private URL apiUrl;

    @Valid
    @NotNull
    private Authentication authentication;

    @Data
    public static class Authentication {
        @NotNull
        private Duration accessTokenTtl;

        @NotNull
        private String accessTokenUri;

        @NotNull
        private String clientId;

        @NotNull
        private String clientSecret;

        @NotNull
        private String username;

        @NotNull
        private String password;

        @Min(0)
        @NonNull
        private Integer retries = 0;
    }
}
It is important to have getters and setters so that Spring can post-process the ConfigurationProperties; I am using Lombok (@Data) for this. Please see here for more details: Baeldung ConfigurationProperties Tutorial
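For reference, the matching application.properties entries might look like this (a sketch with made-up values; Spring's relaxed binding maps camelCase fields such as accessTokenTtl to kebab-case keys, and Duration accepts values like 30s):

gateway.auth.api-url=https://api.example.com
gateway.auth.authentication.access-token-ttl=30s
gateway.auth.authentication.access-token-uri=https://auth.example.com/token
gateway.auth.authentication.client-id=my-client
gateway.auth.authentication.client-secret=changeme
gateway.auth.authentication.username=user
gateway.auth.authentication.password=pass
gateway.auth.authentication.retries=3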
I'm new to MongoDB and Reactor, and I'm trying to retrieve a User together with its associated Profiles. Here are the POJOs:
public class User {
    private @Id String id;
    private String login;
    private String hashPassword;
    @Field("profiles") private List<String> profileObjectIds;
    @Transient private List<Profile> profiles;
}

public class Profile {
    private @Id String id;
    private @Indexed(unique = true) String name;
    private List<String> roles;
}
The problem is: how do I inject the profiles into the User POJO? I'm aware I could use @DBRef to solve the problem, but the MongoDB documentation specifies that manual references should be preferred over DBRefs.
I see two solutions.

1. Fill the POJO when I get it:
public Mono<User> getUser(String login) {
    return userRepository.findByLogin(login)
        .flatMap(user -> ??? );
}
I should do something with profileRepository.findAllById(), but I don't know how to combine both Publishers, given that the profiles result depends on the user result.
2. Declare an AbstractMongoEventListener and override the onAfterConvert method. But here I am mistaken, since the method ends before the result is published:
public void onAfterConvert(AfterConvertEvent<User> event) {
    final User source = event.getSource();
    source.setProfiles(new ArrayList<>());
    profileRepository.findAllById(source.getProfileObjectIds())
        .doOnNext(e -> source.getProfiles().add(e))
        .subscribe();
}
TL;DR
There's no DBRef support in reactive Spring Data MongoDB and I'm not sure there will be.
Explanation
Spring Data projects are organized into Template API, Converter, and Mapping Metadata components. The imperative (blocking) implementation of the Template API uses an imperative approach to fetch documents and convert them into domain objects. MappingMongoConverter in particular handles all the conversion and DBRef resolution. It is a synchronous/imperative API and is used by both Template API implementations (the imperative and the reactive one).
Reusing MappingMongoConverter was the logical decision when adding reactive support, as there was no need to duplicate code. The only limitation is DBRef resolution, which does not fit the reactive execution model.
To support reactive DBRefs, the converter needs to be split up into several bits and the whole association handling requires an overhaul.
Reference: https://jira.spring.io/browse/DATAMONGO-2146
Recommendation
Keep references as keys/ids in your domain model and look them up as needed. zipWith and flatMap are the appropriate operators, depending on what you want to achieve (enhance the model with references, or look up references only).
On a related note: reactive Spring Data MongoDB comes with a somewhat reduced feature set. The contextual SpEL extension is one feature that is not supported, as those components assume an imperative programming model and thus synchronous execution.
For the first point, I finally achieved what I wanted:
public Mono<User> getUser(String login) {
    return userRepository.findByLogin(login)
        .flatMap(user ->
            Mono.just(user)
                .zipWith(profileRepository.findAllById(user.getProfileObjectIds())
                        .collectList(),
                    (u, p) -> {
                        u.setProfiles(p);
                        return u;
                    })
        );
}
In my case, I managed this problem using the following approach. My entity is:
@Getter
@Setter
@NoArgsConstructor
@AllArgsConstructor
@Document(collection = "post")
public class Post implements Serializable {
    private static final long serialVersionUID = -6281811500337260230L;

    @EqualsAndHashCode.Include
    @Id
    private String id;

    private Date date;
    private String title;
    private String body;
    private AuthorDto author;
    private Comment comment;
    private List<Comment> listComments = new ArrayList<>();
    private List<String> idComments = new ArrayList<>();
}
My controller is:
@GetMapping(FIND_POST_BY_ID_SHOW_COMMENTS)
@ResponseStatus(OK)
public Mono<Post> findPostByIdShowComments(@PathVariable String id) {
    return postService.findPostByIdShowComments(id);
}
Last, but not least, my service (here is the solution):
public Mono<Post> findPostByIdShowComments(String id) {
    return postRepo
        .findById(id)
        .switchIfEmpty(postNotFoundException())
        .flatMap(postFound -> commentService
            .findCommentsByPostId(postFound.getId())
            .collectList()
            .flatMap(comments -> {
                postFound.setListComments(comments);
                return Mono.just(postFound);
            })
        );
}

public Flux<Comment> findCommentsByPostId(String id) {
    return postRepo
        .findById(id)
        .switchIfEmpty(postNotFoundException())
        .thenMany(commentRepo.findAll())
        .filter(comment1 -> comment1.getIdPost()
            .equals(id));
}
Thanks, this helped a lot.
Here is my solution:

// Presumably registered as a @Bean in a @Configuration class (an assumption;
// the original snippet does not show the surrounding class).
@Bean
public MappingMongoConverter mappingMongoConverter(MongoMappingContext mongoMappingContext) {
    MappingMongoConverter converter = new MappingMongoConverter(NoOpDbRefResolver.INSTANCE, mongoMappingContext);
    converter.setTypeMapper(new DefaultMongoTypeMapper(null));
    converter.setCustomConversions(mongoCustomConversions());
    return converter;
}
The trick was to use NoOpDbRefResolver.INSTANCE, which effectively disables DBRef resolution in the converter.
I am experimenting with Spring Data Elasticsearch by implementing a cluster which will host multi-tenant indexes, one index per tenant. I am able to create each needed index and set its settings dynamically, like this:
public class SpringDataES {
    @Autowired
    private ElasticsearchTemplate es;

    @Autowired
    private TenantIndexNamingService tenantIndexNamingService;

    private void createIndex(String indexName) {
        Settings indexSettings = Settings.builder()
            .put("number_of_shards", 1)
            .build();
        CreateIndexRequest indexRequest = new CreateIndexRequest(indexName, indexSettings);
        es.getClient().admin().indices().create(indexRequest).actionGet();
        es.refresh(indexName);
    }

    private void prepareIndex(String indexName) {
        if (!es.indexExists(indexName)) {
            createIndex(indexName);
        }
        updateMappings(indexName);
    }
The model is created like this:

@Document(indexName = "#{tenantIndexNamingService.getIndexName()}", type = "movies")
public class Movie {
    @Id
    @JsonIgnore
    private String id;

    private String movieTitle;

    @CompletionField(maxInputLength = 100)
    private Completion movieTitleSuggest;

    private String director;
    private Date releaseDate;
where the index name is passed dynamically via the SpEL expression

#{tenantIndexNamingService.getIndexName()}

which is served by:
@Service
public class TenantIndexNamingService {
    private static final String INDEX_PREFIX = "test_index_";

    private String indexName = INDEX_PREFIX;

    public TenantIndexNamingService() {
    }

    public String getIndexName() {
        return indexName;
    }

    public void setIndexName(int tenantId) {
        this.indexName = INDEX_PREFIX + tenantId;
    }

    public void setIndexName(String indexName) {
        this.indexName = indexName;
    }
}
So, whenever I have to execute a CRUD action, I first point to the right index and then execute the desired action:
tenantIndexNamingService.setIndexName(tenantId);
movieService.save(new Movie("Dead Poets Society", getCompletion("Dead Poets Society"), "Peter Weir", new Date()));
My assumption is that the following dynamic index assignment will not work correctly in a multi-threaded web application:

@Document(indexName = "#{tenantIndexNamingService.getIndexName()}")

This is because TenantIndexNamingService is a singleton. So my question is: how do I achieve the right behavior in a thread-safe manner?
I would probably go with an approach similar to the following one proposed for Cassandra:
https://dzone.com/articles/multi-tenant-cassandra-cluster-with-spring-data-ca
You can have a look at the related GitHub repository here:
https://github.com/gitaroktato/spring-boot-cassandra-multitenant-example
Now, since Elasticsearch differs in how you define a Document, you should mainly focus on defining a request-scoped bean that encapsulates your tenant id and binds it to incoming requests.
Here is my solution: I create a request-scoped bean to hold the index name per HTTP request.
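A minimal sketch of such a bean (the class and bean names are assumptions for illustration; @RequestScope gives each HTTP request its own instance, so concurrent tenants no longer share state). The SpEL expression in @Document would then reference this bean instead of the singleton service:

import org.springframework.stereotype.Component;
import org.springframework.web.context.annotation.RequestScope;

// One instance per HTTP request, so each request carries its own tenant index.
@Component("tenantIndexNameHolder")
@RequestScope
public class TenantIndexNameHolder {
    private static final String INDEX_PREFIX = "test_index_";

    private String indexName = INDEX_PREFIX;

    public String getIndexName() {
        return indexName;
    }

    public void setTenantId(int tenantId) {
        this.indexName = INDEX_PREFIX + tenantId;
    }
}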
How does a singleton bean handle a dynamic index?
I'm using spring-data-elasticsearch, and at the beginning everything works fine.
@Document(type = "products", indexName = "empty")
public class Product {
    ...
}

public interface ProductRepository extends ElasticsearchRepository<Product, String> {
    ...
}
In my model I can search for products:

@Autowired
private ProductRepository repository;
...
repository.findByIdentifier("xxx").getCategory();
So, my problem is: I have the same Elasticsearch type in different indices, and I want to use the same document for all queries. I could handle more connections via a pool, but I have no idea how to implement this.
I would like to have something like this:

ProductRepository customerRepo = ElasticsearchPool.getRepoByCustomer("abc", ProductRepository.class);
customerRepo.findByIdentifier("xxx").getCategory();
Is it possible to create a repository at runtime, with a different index?

Thanks a lot,
Marcel
Yes, it's possible with Spring, but you should use ElasticsearchTemplate instead of a repository. For example, I have two products stored in different indices:
#Document(indexName = "product-a", type = "product")
public class ProductA {
#Id
private String id;
private String name;
private int value;
//Getters and setters
}
#Document(indexName = "product-b", type = "product")
public class ProductB {
#Id
private String id;
private String name;
//Getters and setters
}
Suppose they have the same type, so they share the same fields; but that is not required. Two products can have totally different fields.
I have two repositories:

public interface ProductARepository extends ElasticsearchRepository<ProductA, String> {
}

public interface ProductBRepository extends ElasticsearchRepository<ProductB, String> {
}
These are not strictly necessary either; they are only for testing. The point is that ProductA is stored in the "product-a" index and ProductB in the "product-b" index.
How do you query two (or ten, or a dozen) indices with the same type? Just build a custom repository like this:
@Repository
public class CustomProductRepositoryImpl {

    @Autowired
    private ElasticsearchTemplate elasticsearchTemplate;

    public List<ProductA> findProductByName(String name) {
        MatchQueryBuilder queryBuilder = QueryBuilders.matchPhrasePrefixQuery("name", name);
        // You can query as many indices as you want
        IndicesQueryBuilder builder = QueryBuilders.indicesQuery(queryBuilder, "product-a", "product-b");
        SearchQuery searchQuery = new NativeSearchQueryBuilder().withQuery(builder).build();
        return elasticsearchTemplate.query(searchQuery, response -> {
            SearchHits hits = response.getHits();
            List<ProductA> result = new ArrayList<>();
            Arrays.stream(hits.getHits()).forEach(h -> {
                Map<String, Object> source = h.getSource();
                // get only the id, just for testing
                ProductA productA = new ProductA()
                    .setId(String.valueOf(source.getOrDefault("id", null)));
                result.add(productA);
            });
            return result;
        });
    }
}
You can search as many indices as you want, and you can transparently inject this behavior into ProductARepository by adding custom behavior to single repositories. A second solution is to use index aliases, but then you would have to create a custom model or a custom repository too, as sketched below.
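For completeness, a sketch of creating such an alias with the transport client used elsewhere in this thread (the alias name "products" and the helper class are made up for illustration):

import org.elasticsearch.client.Client;

public class AliasSetup {
    // Point one alias at both indices so queries can target a single name.
    public static void createProductsAlias(Client client) {
        client.admin().indices().prepareAliases()
            .addAlias("product-a", "products")
            .addAlias("product-b", "products")
            .get();
    }
}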
We can use the withIndices method to switch the index if needed:
NativeSearchQueryBuilder nativeSearchQueryBuilder = nativeSearchQueryBuilderConfig.getNativeSearchQueryBuilder();
// Assign the index explicitly.
nativeSearchQueryBuilder.withIndices("product-a");
// Then add the query as usual.
nativeSearchQueryBuilder.withQuery(allQueries);
The @Document annotation on the entity only defines the mapping; to query against a specific index, we still need to use the method above.

@Document(indexName = "product-a", type = "_doc")