Spring 3+: Resolve properties by prefix in a placeholder

In my Java class I have a field of type, say, java.util.Properties or java.util.Map.
I'd like Spring to inject the field with all properties that start with a given prefix, ideally by specifying a wildcard in the property annotation, something like @Value("${prefix.*}"), but that doesn't seem to work, unfortunately.
How can I achieve that?

AFAIK there isn't a direct way to achieve what you are looking for.
You can, however, use Spring EL in combination with a custom implementation of org.springframework.core.io.support.PropertiesLoaderSupport.
First, extend PropertiesLoaderSupport (or any of its subclasses) and introduce a method (say, filterProperties) that returns a java.util.Properties object containing only the filtered properties, such as below:
import java.io.IOException;
import java.util.Properties;

import org.springframework.core.io.support.PropertiesLoaderSupport;

public class CustomPropertyLoader extends PropertiesLoaderSupport {

    public Properties filterProperties(String prefix) throws IOException {
        Properties filtered = new Properties();
        // mergeProperties() loads the configured locations and merges them
        // with any locally set properties.
        Properties all = mergeProperties();
        for (String name : all.stringPropertyNames()) {
            if (name.startsWith(prefix)) {
                filtered.setProperty(name, all.getProperty(name));
            }
        }
        return filtered;
    }
}
TIP - you can accept a regex instead of a plain String argument to cover more generic scenarios.
Second, define a corresponding bean of the above class:
<bean id="customPropertyLoader" class="x.y.z.CustomPropertyLoader">
<property name="locations" value="classpath*:/**/some*.properties"/>
</bean>
Last, annotate the field of type java.util.Properties in your class as:
@Value("#{customPropertyLoader.filterProperties('prefix.')}")
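For completeness, here is roughly what the consuming class could look like (the class and field names are just illustrative):
import java.util.Properties;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

@Component
public class PrefixedConfigConsumer {

    // Receives every property whose key starts with "prefix."
    @Value("#{customPropertyLoader.filterProperties('prefix.')}")
    private Properties prefixedProperties;
}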
P.S.: Please excuse any compilation/syntax errors, as I haven't compiled and checked the setup, but it should work. If not, let me know in the comments.

Related

How to set a global property when a Spring Boot application starts

I want to set up some global variables that are accessible from multiple classes. Examples of these global variables would be things like some keys (Strings).
I am fetching these variables from the database, and they will probably not change except when the program is recompiled.
The easiest way is to define a @Component class with a field of type Map for these properties, then populate it at the start of your application with the information retrieved from the database.
Then, whenever you want to use these properties, inject the component using Spring Boot's DI mechanism.
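A minimal sketch of that approach (the class name and the SettingsRepository DAO are made up for illustration):
import java.util.HashMap;
import java.util.Map;

import javax.annotation.PostConstruct;

import org.springframework.stereotype.Component;

@Component
public class GlobalProperties {

    private final Map<String, String> values = new HashMap<>();

    // Hypothetical DAO that reads the key/value rows from the database.
    private final SettingsRepository settingsRepository;

    public GlobalProperties(SettingsRepository settingsRepository) {
        this.settingsRepository = settingsRepository;
    }

    @PostConstruct
    void load() {
        // Populate once at startup; treated as read-only afterwards.
        settingsRepository.findAll()
                .forEach(s -> values.put(s.getKey(), s.getValue()));
    }

    public String get(String key) {
        return values.get(key);
    }
}
Any other bean can then have a GlobalProperties field injected and call get("someKey").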
You did not provide many details, so the answer will be generic too.
There basically are two approaches:
You could use the Java Properties class.
public static final Properties defaultProperties = new Properties();
Initialize your defaultProperties from the database at program start with defaultProperties.put("name", value).
Access your properties by defaultProperties.get("name").
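A compact sketch of how those fragments fit together (the SettingsDao type and its methods are hypothetical):
import java.util.Properties;

// Holder for the shared properties (option 1).
public final class DefaultProperties {

    public static final Properties defaultProperties = new Properties();

    private DefaultProperties() {
    }

    // Call this once at program start, e.g. from your startup code.
    public static void loadFrom(SettingsDao settingsDao) {
        settingsDao.loadAll().forEach(s ->
                defaultProperties.put(s.getName(), s.getValue()));
    }
}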
Write your own configuration class.
class MyConfig
{
    public final String someStringProperty;
    public final int someIntProperty;

    // Singleton
    public static final MyConfig instance = new MyConfig();

    private MyConfig()
    {
        // Init the properties from the database here; the hard-coded
        // defaults below just stand in for the real lookup in this sketch.
        this.someStringProperty = "default";
        this.someIntProperty = 0;
    }
}
You might need some dependency injection pattern to initialize MyConfig, e.g. to establish the database connection.
Both methods are similar. The second one provides more type safety and prevents you from accidentally accessing a non-existent property because of a typo in the property name. The first one, in contrast, can be made generic so that no changes to the configuration code are required when new properties are added; of course, you still have to write the code that accesses the new property.
You can set these properties in the application.properties file and use them anywhere in your project.
For example, I use a key to call a third-party service, so I keep that key in my properties file.
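For example (the property key and the class are made up for illustration):
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

@Component
public class ThirdPartyClient {

    // Assumes a line such as "thirdparty.api.key=abc123" in application.properties.
    @Value("${thirdparty.api.key}")
    private String apiKey;
}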

CDI: Dynamical injection of a group of classes how to?

I need to dynamically inject a variable group of classes in my application. The purpose is that, as the application grows, I only have to add more classes implementing the same interface. This is easy to do with traditional Java, as I just need to search for all classes in a package and loop over them to instantiate them. I want to do it with CDI. For example:
public interface MyValidatorInterface {
    boolean validate();
}
@Named
public class MyValidator1 implements MyValidatorInterface
...
@Named
public class MyValidator2 implements MyValidatorInterface
...
Now the ugly, non-real Java code, just to give an idea of what I want to do:
public MyValidatorFactory {
for (String className: classNames) {
@Inject
MyValidatorInterface<className> myValidatorInstance;
myValidatorInstance.validate();
}
}
I want to loop over all implementations found in the classNames list (all will be in the same package, BTW) and inject them dynamically, so if next week I add a new validator, MyValidator3, I just have to code the new class and add it to the project. The loop in MyValidatorFactory will find it, inject it and execute the validate() method on the new class too.
I have read about dynamic injection, but I can't find a way to loop over a group of class names and inject them the way I used to instantiate them the old way.
Thanks
What you are describing is what Instance<T> does.
For your sample above, you would do:
`@Inject Instance<MyValidatorInterface> allInstances;`
Now the allInstances variable contains all of your beans that have the given type (MyValidatorInterface). You can further narrow down the set by calling select(..) with qualifiers and/or a bean class; this again returns an Instance, but holding only the subset of previously matching beans. Finally, you call get(), which retrieves the bean instance for you.
NOTE: if you call get() straight away (without select) in the case above, you will get an exception, because you have two beans of the given type and CDI cannot determine which one should be used. This is implied by the rules of type-safe resolution.
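For example, a small sketch of narrowing by bean class before calling get(), using the allInstances field shown above:
// Narrow the Instance down to exactly one implementation, then resolve it.
if (!allInstances.isUnsatisfied()) {
    MyValidatorInterface validator = allInstances
            .select(MyValidator1.class)  // subset containing only MyValidator1
            .get();                      // unambiguous now, so get() is safe
    validator.validate();
}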
What you most likely want to know is that Instance<T> also implements Iterable, so that's how you iterate over the beans. You will want to do something like this:
@Inject
Instance<MyValidatorInterface> allInstances;

public void validateAll() {
    Iterator<MyValidatorInterface> iterator = allInstances.iterator();
    while (iterator.hasNext()) {
        iterator.next().validate();
    }
}

Get bean with specific Arguments to constructor in Spring

Assume I have this class:
public class Student {
    private String name;
    private Address address;

    public Student(String fName, Address address) {
        this.name = fName;
        this.address = address;
    }
}
I defined this class within Spring configuration as
<bean name="studentInstance" class="StackOverFlow.Student"/>
Now I'd like to use getBean with parameters that I will pass to the constructor,
equivalent to Student s = new Student(name, address).
I know Spring supplies a method getBean(class_name, params...),
but I don't know how I should configure the Spring.xml configuration file.
I would like to avoid using setters and getters to fill a new bean.
I found lots of examples of how to define <constructor-arg> within the XML, but each time it was with default values; here I let the user enter different values for each object.
I'd like to use
ApplicationContext context = new ClassPathXmlApplicationContext("Spring.xml");
Student s = (Student) context.getBean("studentInstance", name, address);
I need help with the configuration file only
Thanks in Advance!!
I already checked those links :
Link1 Link2 Link3 Link4
Edit:
Solved! Explicit constructor-injection configuration is not needed here.
I just added prototype scope to my bean as shown below.
<bean name="carInstance" class="MainApp.bl.GasStation.Car" scope="prototype"/>
Firstly, such a bean must obviously be declared with prototype scope.
The prototype scope allows a single bean definition to produce any number of object instances: if the scope is set to prototype, the Spring IoC container creates a new bean instance every time that specific bean is requested.
Object getBean(String name, Object... args) throws BeansException
Return an instance, which may be shared or independent, of the specified bean.
Allows for specifying explicit constructor arguments / factory method arguments, overriding the specified default arguments (if any) in the bean definition.
Refer to the following question for the configuration:
Spring <constructor-arg> element must specify a ref or value
Note: you will have to wrap primitives in their wrapper objects to avoid having predefined values when the object is created.
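Putting it together, a rough sketch, assuming studentInstance is declared with scope="prototype" as in the edit above (the Address construction is illustrative):
ApplicationContext context = new ClassPathXmlApplicationContext("Spring.xml");

Address address = new Address();  // build the address however your class requires
Student s = (Student) context.getBean("studentInstance", "John", address);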

How to autowire a list of properties with Spring? [duplicate]

I've been searching but cannot find these steps. I hope I'm missing something obvious.
I have a properties file with the following contents:
machines=A,B
I have another file like that but having a different number of members in the machines element like this:
machines=B,C,D
My question is: how do I load this variable-length machines property into a bean in my Spring config in a generic way?
something like this:
<property name="machines" value="${machines}"/>
where machines is an array or list in my java code. I can define it however I want if I can figure out how to do this.
Basically I'd rather have Spring do the parsing and put each value into a list element, instead of writing something that reads in the full machines string and does the parsing myself (on the comma delimiter) to put each value into an array or list. Is there an easy way to do this that I'm missing?
You may want to take a look at Spring's StringUtils class. It has a number of useful methods for converting a comma-separated list into a Set or a String array. You can use any of these utility methods, together with Spring's factory-method support, to inject a parsed value into your bean. Here is an example:
<property name="machines">
<bean class="org.springframework.util.StringUtils" factory-method="commaDelimitedListToSet">
<constructor-arg type="java.lang.String" value="${machines}"/>
</bean>
</property>
In this example, the value for 'machines' is loaded from the properties file.
If an existing utility method does not meet your needs, it is pretty straightforward to create your own. This technique allows you to execute any static utility method.
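For example, a hypothetical helper (the class and method names are made up) that splits on commas and trims each entry; you would wire it exactly like the StringUtils example above, with factory-method="parse":
import java.util.ArrayList;
import java.util.List;

// Hypothetical utility: splits a comma-separated string and trims each entry.
public final class MachineListParser {

    private MachineListParser() {
    }

    public static List<String> parse(String csv) {
        List<String> machines = new ArrayList<String>();
        for (String part : csv.split(",")) {
            machines.add(part.trim());
        }
        return machines;
    }
}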
Spring EL makes this easier.
Java:
List<String> machines;
Context:
<property name="machines" value="#{T(java.util.Arrays).asList('${machines}')}"/>
If you make the "machines" property a String array, then Spring will do the conversion automatically for you:
machines=B,C,D
<property name="machines" value="${machines}"/>
public void setMachines(String[] machines) {
    this.machines = machines;
}
Since Spring 3.0, it is also possible to read in the list of values using the @Value annotation.
Property file:
machines=B,C,D
Java code:
import org.springframework.beans.factory.annotation.Value;
#Value("#{'${machines}'.split(',')}")
private List<String> machines;
You can inject the values into the list directly, without boilerplate code (Spring 3.1+):
@Value("${machines}")
private List<String> machines;
for the entry machines=B,C,D in the properties file, by declaring the following two beans in your configuration:
@Bean
public static PropertySourcesPlaceholderConfigurer propertyPlaceholderConfigurer() {
return new PropertySourcesPlaceholderConfigurer();
}
@Bean
public ConversionService conversionService() {
return new DefaultConversionService();
}
These will handle the comma-based split and trim the surrounding whitespace as well.

Spring Data, Mongo, and #TypeAlias: reading not working

The issue
A while back I started using MongoDB and Spring Data. I'd left most of the default functionality in place, and so all of my documents were stored in MongoDB with a _class field pointing to the entity's fully-qualified class name.
Right away that didn't "smell" right to me, but I left it alone until recently, when I refactored a bunch of code and suddenly none of my documents could be read back from MongoDB and converted into their (refactored/renamed) Java entities. I quickly realized it was because there was now a fully-qualified-classname mismatch. I also quickly realized that, given that I might refactor again sometime in the future, if I didn't want all of my data to become unusable I'd need to figure something else out.
What I've tried
So that's what I'm doing, but I've hit a wall. I think that I need to do the following:
Annotate each entity with @TypeAlias("ta"), where "ta" is a unique, stable string.
Configure and use a different TypeInformationMapper for Spring Data to use when converting my documents back into their Java entities; it needs to know, for example, that a type-alias of "widget.foo" refers to com.myapp.document.FooWidget.
I determined that I should use a TypeInformationMapper of type org.springframework.data.convert.MappingContextTypeInformationMapper. Supposedly a MappingContextTypeInformationMapper will scan my entities/documents to find @TypeAlias'ed documents and store an alias-to-class mapping. But I can't pass that to my MappingMongoConverter; I have to pass a subtype of MongoTypeMapper. So I am configuring a DefaultMongoTypeMapper and passing a List containing one MappingContextTypeInformationMapper as its "mappers" constructor arg.
Code
Here's the relevant part of my spring XML config:
<bean id="mongoTypeMapper" class="org.springframework.data.mongodb.core.convert.DefaultMongoTypeMapper">
<constructor-arg name="typeKey" value="_class"></constructor-arg>
<constructor-arg name="mappers">
<list>
<ref bean="mappingContextTypeMapper" />
</list>
</constructor-arg>
</bean>
<bean id="mappingContextTypeMapper" class="org.springframework.data.convert.MappingContextTypeInformationMapper">
<constructor-arg ref="mappingContext" />
</bean>
<bean id="mappingMongoConverter"
class="org.springframework.data.mongodb.core.convert.MappingMongoConverter">
<constructor-arg ref="mongoDbFactory" />
<constructor-arg ref="mappingContext" />
<property name="mapKeyDotReplacement" value="__dot__" />
<property name="typeMapper" ref="mongoTypeMapper"/>
</bean>
<bean id="mongoTemplate" class="org.springframework.data.mongodb.core.MongoTemplate">
<constructor-arg ref="mongoDbFactory" />
<constructor-arg ref="mappingMongoConverter" />
</bean>
Here's a sample entity/document:
#Document(collection="widget")
#TypeAlias("widget.foo")
public class FooWidget extends Widget {
// ...
}
One important note is that any such "Widget" entity is stored as a nested document in Mongo. So in reality you won't really find a populated "Widget" collection in my MongoDB instance. Instead, a higher-level "Page" class can contain multiple "widgets" like so:
#Document(collection="page")
#TypeAlias("page")
public class Page extends BaseDocument {
// ...
private List<Widget> widgets = new ArrayList<Widget>();
}
The error I'm stuck on
What happens is that I can save a Page along with a number of nested Widgets in Mongo. But when I try to read said Page back out, I get something like the following:
org.springframework.beans.BeanInstantiationException: Could not instantiate bean class [com.myapp.document.Widget]: Is it an abstract class?
I can indeed see pages in Mongo containing "_class" : "page", with nested widgets also containing "_class" : "widget.foo". It just appears that the mapping is not being applied in reverse.
Is there anything I might be missing?
In the default setting, the MappingMongoConverter creates a DefaultMongoTypeMapper which in turn creates a MappingContextTypeInformationMapper.
That last class is the one responsible for maintaining the typeMap cache between TypeInformation and aliases.
That cache is populated in two places:
In the constructor, for each mappingContext.getPersistentEntities()
When writing an object of an aliased type.
So if you want to make sure the alias is recognized in any context, you need to make sure that all your aliased entities are part of mappingContext.getPersistentEntities().
How you do that depends on your configuration. For instance:
if you're using AbstractMongoConfiguration, you can override its getMappingBasePackage() to return the name of a package containing all of your entities.
if you're using Spring Boot, you can use @EntityScan to declare which packages to scan for entities (see the sketch after this list).
in any case, you can always configure it with a custom set (from a static list or a custom scan) using mongoMappingContext.setInitialEntitySet().
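For the Spring Boot option, a minimal sketch (the package name is an example; the exact package of @EntityScan depends on your Boot version):
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.domain.EntityScan;

// Tells Boot's Mongo auto-configuration which packages to scan for
// @Document/@Persistent classes when building the initial entity set.
@SpringBootApplication
@EntityScan("com.myapp.document")
public class Application {
}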
One side note: for an entity to be discovered by a scan, it has to be annotated with either @Document or @Persistent.
More information can be found in the spring-data-commons Developer Guide.
I spent a bunch of time with my debugger and the Spring Data source code, and I learned that Spring Data isn't as good with polymorphism as it probably should be, especially given the schema-less nature of NoSQL solutions like MongoDB. But ultimately what I did was write my own type mapper, and that wasn't too tough.
The main problem was that, when reading in my Page document, the default mappers used by Spring Data would see a collection called widgets, then consult the Page class to determine that widgets pointed to a List, then consult the Widget class to look for @TypeAlias information. What I needed instead was a mapper that scanned my persistent entities up front and stored an alias-to-class mapping for later use. That's what my custom type mapper does.
I wrote a blog post discussing the details.
If you extend AbstractMongoConfiguration, you can override the getMappingBasePackage method to specify the base package for your documents.
@Configuration
class RepositoryConfig extends AbstractMongoConfiguration {
@Override
protected String getMappingBasePackage() {
return "com.example";
}
}
Update: In spring-data-mongodb 2+ you should use:
@Configuration
class RepositoryConfig extends AbstractMongoConfiguration {
@Override
protected Collection<String> getMappingBasePackages(){
return Arrays.asList("com.example");
}
}
because getMappingBasePackage() is now deprecated and won't work.
Today I ran into the exact same issue. After more research I found out that my subclass was missing a repository. It appears that Spring Data uses the repositories to determine which concrete subclass to create, and when one is missing it falls back to the superclass, which in this case is abstract.
So please try adding a FooWidgetRepository mapped to FooWidget with the correct ID type. It might work in your case as well.
If you use Spring Boot with auto-configuration, declaring the following bean can help:
@Bean
MongoMappingContext mongoMappingContext(ApplicationContext applicationContext, MongoCustomConversions conversions) throws ClassNotFoundException {
MongoMappingContext context = new MongoMappingContext();
context.setInitialEntitySet(new EntityScanner(applicationContext).scan(Persistent.class));
context.setSimpleTypeHolder(conversions.getSimpleTypeHolder());
return context;
}
What does the trick is the following line:
new EntityScanner(applicationContext).scan(Persistent.class)
Instead of scanning only for @Document classes, it will scan for both @Document and @TypeAlias, since both of these annotations are meta-annotated with @Persistent.
Andreas Svensson is right; this can be done much more simply than described by Dave Taubler.
I posted a slightly more elaborate answer than Andreas' (including sample code) in this post. Excerpt:
So all you need to do is declare an "unused" repository interface for your sub-classes, just like you proposed as "unsafe" in your OP:
public interface NodeRepository extends MongoRepository<Node, String> {
// all of your repo methods go here
Node findById(String id);
Node findFirst100ByNodeType(String nodeType);
... etc.
}
public interface LeafType1Repository extends MongoRepository<LeafType1, String> {
// leave empty
}
public interface LeafType2Repository extends MongoRepository<LeafType2, String> {
// leave empty
}
