Wire multiple config classes using dependency injection - java

I have a working configuration class in Spring. I tried to replace a hard-coded string with a configuration map using dependency injection.
@Configuration
@Component
public class BwlConfiguration {

    @Resource(name = "loadParameters")
    private Map<ConfigEnum, String> conf;

    private String address;

    public BwlConfiguration() {
        address = conf.get(SPI_BL);
    }
    ...
}
The class that provides the conf map:
@Configuration
@Component
public class ConfigLoader {

    @Resource(name = "returnEnv")
    private Map<String, String> env;

    @Bean
    public Map<ConfigEnum, String> loadParameters() throws ParameterNotSetException {
        ....
        return parameterMap;
    }
}
The class that provides the env map:
@Configuration
public class EnvConf {

    @Bean
    public Map<String, String> returnEnv() {
        return System.getenv();
    }
}
When I run the program, a NullPointerException is thrown at the address = conf.get(SPI_BL); line. I tried replacing @Component with @Import(...class), with the same result, and that defeats the point of injection.
Am I using these annotations wrong? Thanks

I replaced the constructor in BwlConfiguration with:
@Bean
public String address() {
    return conf.get(SPI_BL);
}
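For context, the NullPointerException occurs because fields annotated with @Resource or @Autowired are populated only after the constructor has finished, so conf is still null inside the constructor body. A minimal constructor-injection sketch, assuming SPI_BL is a constant of ConfigEnum as used above:

import java.util.Map;

import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Configuration;

@Configuration
public class BwlConfiguration {

    private final String address;

    // The map is passed in as a constructor argument, so it is already
    // available when the constructor body runs.
    public BwlConfiguration(@Qualifier("loadParameters") Map<ConfigEnum, String> conf) {
        this.address = conf.get(ConfigEnum.SPI_BL);
    }

    public String getAddress() {
        return address;
    }
}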

Related

SpringBoot selecting the @Repository based on design pattern and configuration

A small question on Spring Boot: how can I use a design pattern combined with Spring @Value configuration in order to select the appropriate @Repository, please?
Setup: a Spring Boot project which does nothing but save a POJO. The "difficulty" is the need to choose where to save the POJO, based on some info from inside the request payload.
I started with a first, straightforward version, which looks like this:
@RestController
public class ControllerVersionOne {

    @Autowired private ElasticRepository elasticRepository;
    @Autowired private MongoDbRepository mongoRepository;
    @Autowired private RedisRepository redisRepository;
    // imagine many more other repositories
    // imagine many more other repositories
    // imagine many more other repositories

    @PostMapping(path = "/save")
    public String save(@RequestBody MyRequest myRequest) {
        String whereToSave = myRequest.getWhereToSave();
        MyPojo myPojo = new MyPojo(UUID.randomUUID().toString(), myRequest.getValue());
        if (whereToSave.equals("elastic")) {
            return elasticRepository.save(myPojo).toString();
        } else if (whereToSave.equals("mongo")) {
            return mongoRepository.save(myPojo).toString();
        } else if (whereToSave.equals("redis")) {
            return redisRepository.save(myPojo).toString();
        // imagine many more if
        // imagine many more if
        // imagine many more if
        } else {
            return "unknown destination";
        }
    }
}
With the appropriate @Configuration and @Repository for each and every database. I am showing 3 here, but imagine many. The project has a way to inject future @Configuration and @Repository classes as well (that is not actually where the question lies).
@Configuration
public class ElasticConfiguration extends ElasticsearchConfiguration { ... }

@Repository
public interface ElasticRepository extends CrudRepository<MyPojo, String> { ... }

@Configuration
public class MongoConfiguration extends AbstractMongoClientConfiguration { ... }

@Repository
public interface MongoDbRepository extends MongoRepository<MyPojo, String> { ... }

@Configuration
public class RedisConfiguration { ... }

@Repository
public interface RedisRepository { ... }
Please note, some of the repositories are not children of CrudRepository. There is no direct ___Repository which can cover everything.
And this first version is working fine. Very happy: I am able to save the POJO to where it should be saved, as I am getting the correct repository bean using this if-else structure.
In my opinion, this structure is not very elegant (it is OK if we have different opinions here) and, especially, not flexible at all (each and every possible repository needs to be hard-coded; again, imagine many).
This is why I went on to refactor it into this second version:
@RestController
public class ControllerVersionTwo {

    private ElasticRepository elasticRepository;
    private MongoDbRepository mongoRepository;
    private RedisRepository redisRepository;

    private Map<String, Function<MyPojo, MyPojo>> designPattern;

    @Autowired
    public ControllerVersionTwo(ElasticRepository elasticRepository, MongoDbRepository mongoRepository, RedisRepository redisRepository) {
        this.elasticRepository = elasticRepository;
        this.mongoRepository = mongoRepository;
        this.redisRepository = redisRepository;
        // many more repositories
        designPattern = new HashMap<>();
        designPattern.put("elastic", myPojo -> elasticRepository.save(myPojo));
        designPattern.put("mongo", myPojo -> mongoRepository.save(myPojo));
        designPattern.put("redis", myPojo -> redisRepository.save(myPojo));
        // many more put
    }

    @PostMapping(path = "/save")
    public String save(@RequestBody MyRequest myRequest) {
        String whereToSave = myRequest.getWhereToSave();
        MyPojo myPojo = new MyPojo(UUID.randomUUID().toString(), myRequest.getValue());
        return designPattern.get(whereToSave).apply(myPojo).toString();
    }
}
As you can see, I am leveraging a design pattern, refactoring the if-else into a HashMap.
This post is not about if-else vs. hashmap, by the way.
This works fine, but please note that the map is a Map<String, Function<MyPojo, MyPojo>>, as I cannot construct a Map<String, @Repository>.
With this second version the if-else is refactored away, but again, we need to hard-code the HashMap.
This is why I had the idea of building a third version, where I can configure the map itself via a Spring Boot property, using @Value for a Map.
Here is what I tried:
@RestController
public class ControllerVersionThree {

    @Value("#{${configuration.design.pattern.map}}")
    Map<String, String> configurationDesignPatternMap;

    private Map<String, Function<MyPojo, MyPojo>> designPatternStrategy;

    public ControllerVersionThree() {
        convertConfigurationDesignPatternMapToDesignPatternStrategy(configurationDesignPatternMap, designPatternStrategy);
    }

    private void convertConfigurationDesignPatternMapToDesignPatternStrategy(Map<String, String> configurationDesignPatternMap, Map<String, Function<MyPojo, MyPojo>> designPatternStrategy) {
        // convert configurationDesignPatternMap
        // {elastic:ElasticRepository, mongo:MongoDbRepository, redis:RedisRepository, ...}
        // to a map where I can directly get the appropriate repository based on the key
    }

    @PostMapping(path = "/save")
    public String save(@RequestBody MyRequest myRequest) {
        String whereToSave = myRequest.getWhereToSave();
        MyPojo myPojo = new MyPojo(UUID.randomUUID().toString(), myRequest.getValue());
        return designPatternStrategy.get(whereToSave).apply(myPojo).toString();
    }
}
And I would configure in the property file:
configuration.design.pattern.map={elastic:ElasticRepository, mongo:MongoDbRepository, saveToRedis:RedisRepository, redis:RedisRepository, ...}
And tomorrow, I would be able to add or remove future repository targets through configuration:
configuration.design.pattern.map={elastic:ElasticRepository, anotherElasticKey:ElasticRepository, redis:RedisRepository, postgre:PostGreRepository}
Unfortunately, I am stuck.
What is the correct code to leverage a configurable property that maps a key to "which @Repository to use", please?
Thank you for your help.
You can create a base repository to be extended by all your repositories:
public interface BaseRepository {
    MyPojo save(MyPojo onboarding);
}
so you will have a bunch of repositories like:
#Repository("repoA")
public interface ARepository extends JpaRepository<MyPojo, String>, BaseRepository {
}
#Repository("repoB")
public interface BRepository extends JpaRepository<MyPojo, String>, BaseRepository {
}
...
Those repositories will be provided by a factory:
public interface BaseRepositoryFactory {
    BaseRepository getBaseRepository(String whereToSave);
}
that you must configure in a ServiceLocatorFactoryBean:
@Bean
public ServiceLocatorFactoryBean baseRepositoryBean() {
    ServiceLocatorFactoryBean serviceLocatorFactoryBean = new ServiceLocatorFactoryBean();
    serviceLocatorFactoryBean.setServiceLocatorInterface(BaseRepositoryFactory.class);
    return serviceLocatorFactoryBean;
}
Now you can inject the factory wherever you need it and get the repository you want:
@Autowired
private BaseRepositoryFactory baseRepositoryFactory;
...
baseRepositoryFactory.getBaseRepository("repoA").save(myPojo);
...
Hope it helps.
Short answer:
create a shared interface
create multiple sub-classes of this interface (one per storage), using different Spring component names
use a map to deal with aliases
use the Spring context to retrieve the right bean by alias (instead of creating a custom factory)
Now adding a new storage only means adding a new repository class with a name.
Explanation:
As mentioned in the other answer, you first need to define a common interface, as you can't use CrudRepository.save(...) for everything.
In my example I reuse the same signature as the save method, to avoid re-implementing it in the sub-classes of CrudRepository.
public interface MyInterface<T> {
    <S extends T> S save(S entity);
}
Redis Repository:
#Repository("redis") // Here is the name of the redis repo
public class RedisRepository implements MyInterface<MyPojo> {
#Override
public <S extends MyPojo> S save(S entity) {
entity.setValue(entity.getValue() + " saved by redis");
return entity;
}
}
For the other repository, based on CrudRepository, there is no need to provide an implementation:
#Repository("elastic") // Here is the name of the elastic repo
public interface ElasticRepository extends CrudRepository<MyPojo, String>, MyInterface<MyPojo> {
}
Create a configuration for your aliases in application.yml
configuration:
  design:
    pattern:
      map:
        redis: redis
        saveToRedisPlease: redis
        elastic: elastic
Create a custom properties class to retrieve the map:
@Component
@ConfigurationProperties(prefix = "configuration.design.pattern")
public class PatternProperties {

    private Map<String, String> map;

    public String getRepoName(String alias) {
        return map.get(alias);
    }

    public Map<String, String> getMap() {
        return map;
    }

    public void setMap(Map<String, String> map) {
        this.map = map;
    }
}
Now create version three of your controller, with the Spring ApplicationContext injected:
@RestController
public class ControllerVersionThree {

    private final ApplicationContext context;
    private PatternProperties designPatternMap;

    public ControllerVersionThree(ApplicationContext context,
                                  PatternProperties designPatternMap) {
        this.context = context;
        this.designPatternMap = designPatternMap;
    }

    @PostMapping(path = "/save")
    public String save(@RequestBody MyRequest myRequest) {
        String whereToSave = myRequest.getWhereToSave();
        MyPojo myPojo = new MyPojo(UUID.randomUUID().toString(), myRequest.getValue());
        String repoName = designPatternMap.getRepoName(whereToSave);
        MyInterface<MyPojo> repo = context.getBean(repoName, MyInterface.class);
        return repo.save(myPojo).toString();
    }
}
You can check that this is working with a test:
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.test.web.client.TestRestTemplate;
import org.springframework.boot.test.web.server.LocalServerPort;
import org.springframework.http.HttpEntity;

import static org.junit.jupiter.api.Assertions.assertEquals;

@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
class ControllerVersionThreeTest {

    @LocalServerPort
    private int port;

    @Autowired
    private TestRestTemplate restTemplate;

    @Test
    void testSaveByRedis() {
        // Given: here 'redis' is the name of the spring bean
        HttpEntity<MyRequest> request = new HttpEntity<>(new MyRequest("redis", "aValue"));
        // When
        String response = restTemplate.postForObject("http://localhost:" + port + "/save", request, String.class);
        // Then
        assertEquals("MyPojo{value='aValue saved by redis'}", response);
    }

    @Test
    void testSaveByRedisAlias() {
        // Given: here 'saveToRedisPlease' is an alias for the redis bean
        HttpEntity<MyRequest> request = new HttpEntity<>(new MyRequest("saveToRedisPlease", "aValue"));
        // When
        String response = restTemplate.postForObject("http://localhost:" + port + "/save", request, String.class);
        // Then
        assertEquals("MyPojo{value='aValue saved by redis'}", response);
    }
}
Have you tried creating a configuration class to create your repository map?
@Configuration
public class MyConfiguration {

    // The repositories are Spring Data interfaces, so they are injected as
    // method parameters rather than instantiated with "new". A shared
    // interface (such as the BaseRepository above) is assumed as the value type.
    @Bean
    public Map<String, BaseRepository> repositoryMap(RedisRepository redisRepository,
                                                     MongoDbRepository mongoRepository,
                                                     ElasticRepository elasticRepository) {
        Map<String, BaseRepository> repositoryMap = new HashMap<>();
        repositoryMap.put("redis", redisRepository);
        repositoryMap.put("mongo", mongoRepository);
        repositoryMap.put("elastic", elasticRepository);
        return Collections.unmodifiableMap(repositoryMap);
    }
}
Then you could have the following in your rest controller
@RestController
public class ControllerVersionFour {

    @Autowired
    private Map<String, BaseRepository> repositoryMap;

    @PostMapping(path = "/save/{dbname}")
    public String save(@RequestBody MyRequest myRequest, @PathVariable("dbname") String dbname) {
        MyPojo myPojo = new MyPojo(UUID.randomUUID().toString(), myRequest.getValue());
        return repositoryMap.get(dbname).save(myPojo).toString();
    }
}
It might be better to have the db as a path/query parameter instead of having it in the request body. That way, depending on your use case, you may be able to just save the request body directly instead of creating another POJO.
This post may also be useful for autowiring a map
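On that note, when all repositories implement a shared interface (such as the BaseRepository or MyInterface<MyPojo> shown above), Spring can inject a map of all matching beans keyed by bean name, so the map never has to be built by hand. A minimal sketch under that assumption (the class name ControllerVersionFive is just illustrative):

import java.util.Map;
import java.util.UUID;

import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class ControllerVersionFive {

    // Spring populates this map with every BaseRepository bean,
    // keyed by its bean name (e.g. "redis", "elastic").
    private final Map<String, BaseRepository> repositories;

    public ControllerVersionFive(Map<String, BaseRepository> repositories) {
        this.repositories = repositories;
    }

    @PostMapping(path = "/save")
    public String save(@RequestBody MyRequest myRequest) {
        MyPojo myPojo = new MyPojo(UUID.randomUUID().toString(), myRequest.getValue());
        return repositories.get(myRequest.getWhereToSave()).save(myPojo).toString();
    }
}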

Create bean using spring xml

How do I create an XML bean definition for the Java class below? I am using an old Spring version, in which I need to create an XML bean for the following TestSample class.
@Service
@EnableConfigurationProperties(SampleProperties.class)
public class TestSample {

    @Autowired
    public ClientService clientService;

    @Autowired
    public RestTemplate restTemp;

    public Map<String, String> testing(String a, String b, String c) throws Exception {
        Map<String, String> map = clientService.find(a, b, c);
        System.out.println("**==" + map.get(0));
        return map;
    }
}
The ClientService class:
@Service
@Slf4j
@DependsOn("restTemp")
public class ClientService {

    public ClientService(
            @Autowired final SampleProperties sampleProperties,
            @Autowired(required = false) final ObjectMapper pObjectMapper) throws UnknownHostException {
    }
    //......
}
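As a rough sketch, an equivalent XML configuration could look like the following (package names such as com.example are placeholders, and the sampleProperties, objectMapper and restTemp beans are assumed to be defined elsewhere):

<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:context="http://www.springframework.org/schema/context"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans
                           http://www.springframework.org/schema/beans/spring-beans.xsd
                           http://www.springframework.org/schema/context
                           http://www.springframework.org/schema/context/spring-context.xsd">

    <!-- Processes @Autowired and other annotations on existing beans -->
    <context:annotation-config/>

    <!-- TestSample: its @Autowired fields are filled in by annotation-config -->
    <bean id="testSample" class="com.example.TestSample"/>

    <!-- ClientService wired explicitly through its constructor -->
    <bean id="clientService" class="com.example.ClientService">
        <constructor-arg ref="sampleProperties"/>
        <constructor-arg ref="objectMapper"/>
    </bean>

</beans>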

Spring Boot - inject static map from application.yml

I referred to Spring Boot - inject map from application.yml for injecting a map from the application.yml file.
My application.yml snippet is below:
easy.app.pairMap:
  test1: 'value1'
  test2: 'value2'
The properties class is as below:
@Component
@Configuration
@ConfigurationProperties("easy.app")
@EnableConfigurationProperties
public class TestProperties {

    private Map<String, String> pairMap = new HashMap<String, String>();

    public void setPairMap(Map<String, String> pairMap) {
        this.pairMap = pairMap;
    }
}
The code given above works. The map is not read from the application.yml file when pairMap is made static, as below.
@Component
@Configuration
@ConfigurationProperties("easy.app")
@EnableConfigurationProperties
public class TestProperties {

    private static Map<String, String> pairMap = new HashMap<String, String>();

    public static void setPairMap(Map<String, String> pairMap) {
        TestProperties.pairMap = pairMap;
    }
}
PS: The issue occurs only when injecting the map, not when injecting a string. Why this behaviour?
I.e. the injection of the string in the following configuration works, but the map injection does not:
easy.app.key1: 'abc'
easy.app.pairMap:
  test1: 'value1'
  test2: 'value2'
The properties class is like below:
@Component
@Configuration
@ConfigurationProperties("easy.app")
@EnableConfigurationProperties
public class TestProperties {

    private static Map<String, String> pairMap = new HashMap<String, String>();
    private static String key1;

    public static void setPairMap(Map<String, String> pairMap) {
        TestProperties.pairMap = pairMap;
    }

    public static void setKey1(String key1) {
        TestProperties.key1 = key1;
    }

    public String getKey1() {
        return key1;
    }
}
Fix with this:
easy:
  app:
    pairMap:
      test1: value1
      test2: value2
@CompileStatic
@Component
@EnableConfigurationProperties
class ConfigHolder {

    @Value(value = '${easy.app.pairMap.test1}')
    String test1Valse;

    @Value(value = '${easy.app.pairMap.test2}')
    String test2Valse;
}
@CompileStatic
@Configuration
@EnableConfigurationProperties
public class TestProperties {

    @Autowired
    ConfigHolder configHolder;

    private Map<String, String> pairMap = new HashMap<String, String>();

    public void setPairMap(Map<String, String> pairMap) {
        if (pairMap != null && !pairMap.isEmpty()) {
            this.pairMap = pairMap;
        } else {
            this.pairMap.put("test1", configHolder.test1Valse);
            this.pairMap.put("test2", configHolder.test2Valse);
        }
    }
}
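Another common workaround, if the values really need to be reachable from static code, is to let Spring bind to instance members as usual and copy them into a static field once binding is done. A minimal sketch, assuming the same easy.app prefix (the accessor name getStaticPairMap is just illustrative):

import java.util.HashMap;
import java.util.Map;

import javax.annotation.PostConstruct;

import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Component;

@Component
@ConfigurationProperties("easy.app")
public class TestProperties {

    private static Map<String, String> staticPairMap = new HashMap<>();

    // Spring binds easy.app.pairMap to this instance property as usual.
    private Map<String, String> pairMap = new HashMap<>();

    public void setPairMap(Map<String, String> pairMap) {
        this.pairMap = pairMap;
    }

    public Map<String, String> getPairMap() {
        return pairMap;
    }

    // After binding has completed, expose the values statically.
    @PostConstruct
    public void init() {
        staticPairMap = pairMap;
    }

    public static Map<String, String> getStaticPairMap() {
        return staticPairMap;
    }
}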

Spring Boot #Autowired Environment throws NullPointerException

I have a simple Spring Boot application in which I am using Environment to access the properties in the application.properties file. The Environment bean works when it is accessed in the main method of the application class.
@Configuration
@EnableAutoConfiguration
@ComponentScan(basePackages = "com.cisco.sdp.cdx.consumers")
public class StreamingConsumerApplication {

    public static void main(String[] args) {
        ConfigurableApplicationContext context = SpringApplication.run(StreamingConsumerApplication.class, args);
        Environment env = context.getBean(Environment.class);
        StreamingConsumerFactory factory = context.getBean(StreamingConsumerFactory.class);
        StreamingConsumer streamingConsumer = factory.createStreamingConsumer(
                StreamType.valueOf(env.getRequiredProperty("streaming.application.type")));
        streamingConsumer.consume();
    }
}
When the same is used in a different class, it throws a NullPointerException. I tried annotating the class with the @Configuration, @Component, @Repository and @Service annotations, but that did not work.
I tried @Autowired as well as @Resource annotations, but neither worked.
@Component
public class InventoryStreamingConsumer implements StreamingConsumer {

    @Autowired
    private Environment env;

    @Autowired
    private JavaSparkSessionSingleton sparksession;

    @Autowired
    private StreamingContext _CONTEXT;

    private final Map<String, String> kafkaParams = new HashMap<String, String>();

    @Override
    public void consume() {
        if (env == null) {
            System.out.println("ENV is NULL");
        }
        System.out.println(env.getRequiredProperty("kafka.brokerlist"));
        kafkaParams.put("metadata.broker.list", env.getRequiredProperty("kafka.brokerlist"));
        Set<String> topics = Collections.singleton(env.getRequiredProperty("kafka.topic"));
        // Unrelated code.
    }
}
I tried following the answers provided in the questions below:
Spring Boot - Environment @Autowired throws NullPointerException
Autowired Environment is null
I am looking for suggestions on solving the issue.
The @Configuration annotation is misused here for InventoryStreamingConsumer. Try @Component, @Repository or @Service.
UPDATE
Another misuse is
StreamingConsumer streamingConsumer = factory.createStreamingConsumer(StreamType.valueOf(env.getRequiredProperty("streaming.application.type")));
@Autowired or @Resource can only work in beans created by Spring. The streamingConsumer created by your StreamingConsumerFactory cannot use @Autowired for injection of its properties.
You should create a @Configuration class to tell Spring to create streamingConsumer from your factory, like this:
@Configuration
public class ConsumerCreator {

    @Autowired
    StreamingConsumerFactory factory;

    @Autowired
    Environment env;

    @Bean
    public StreamingConsumer streamingConsumer() {
        return factory.createStreamingConsumer(
                StreamType.valueOf(env.getRequiredProperty("streaming.application.type")));
    }
}
And use no annotation on InventoryStreamingConsumer; meanwhile, use
StreamingConsumer streamingConsumer = context.getBean(StreamingConsumer.class);
in your StreamingConsumerApplication.main() method instead to retrieve the streamingConsumer.
First, please annotate the main class with only @SpringBootApplication:
@SpringBootApplication
public class StreamingConsumerApplication {
}
@ComponentScan is required only if your packages do not follow the default structure, i.e. the main class is not in a parent package that contains every other class in the same package or in a sub-package of it.
Second, please create a separate configuration class annotated with @Configuration and define a @Bean there for StreamingConsumer streamingConsumer; then it can be @Autowired or injected in the InventoryStreamingConsumer class.
Third, where is the @Bean for JavaSparkSessionSingleton defined? Are you sure it can be auto-configured for injection?
Fourth, InventoryStreamingConsumer can be a @Component; injecting Environment with @Autowired will work once the above things are sorted.
Also, I recommend changing your class to the following, depending on how the consume() method is used:
@Component
public class InventoryStreamingConsumer implements StreamingConsumer {

    private final Environment env;
    private final JavaSparkSessionSingleton sparksession;
    private final StreamingContext _CONTEXT;
    private final Map<String, String> kafkaParams = new HashMap<String, String>();

    @Autowired
    public InventoryStreamingConsumer(Environment env, JavaSparkSessionSingleton sparkSession, StreamingContext context) {
        this.env = env;
        this.sparksession = sparkSession;
        this._CONTEXT = context;
    }

    @Override
    public void consume() {
        if (env == null) {
            System.out.println("ENV is NULL");
        }
        System.out.println(env.getRequiredProperty("kafka.brokerlist"));
        kafkaParams.put("metadata.broker.list", env.getRequiredProperty("kafka.brokerlist"));
        Set<String> topics = Collections.singleton(env.getRequiredProperty("kafka.topic"));
        // Unrelated code.
    }
}
I had a similar issue, but I read the properties from a different file in a different location, like common/jdbc.properties. I solved the issue like this:
import org.springframework.context.EnvironmentAware;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.PropertySource;
import org.springframework.context.support.PropertySourcesPlaceholderConfigurer;
import org.springframework.core.env.Environment;

@Configuration
@PropertySource(value = {"classpath:common/jdbc.properties"})
public class ExternalConfig implements EnvironmentAware {

    private Environment environment;

    public void setEnvironment(Environment environment) {
        this.environment = environment;
    }

    @Bean
    public static PropertySourcesPlaceholderConfigurer propertyConfigInDev() {
        return new PropertySourcesPlaceholderConfigurer();
    }

    public String getJdbcUrl() {
        return environment.getProperty("jdbc.url");
    }
}
Try adding
@PropertySource("classpath:application.properties")
on the InventoryStreamingConsumer class.
This is how I am using it:
@Configuration
@ComponentScan({ "com.spring.config" })
@EnableTransactionManagement
@PropertySource("classpath:application.properties")
public class HibernateConfiguration {

    private static final String PROPERTY_NAME_DATABASE_DRIVER = "db.driver";
    private static final String PROPERTY_NAME_DATABASE_PASSWORD = "db.password";
    private static final String PROPERTY_NAME_DATABASE_URL = "db.url";
    private static final String PROPERTY_NAME_DATABASE_USERNAME = "db.username";
    private static final String PROPERTY_NAME_HIBERNATE_DIALECT = "hibernate.dialect";
    private static final String PROPERTY_NAME_HIBERNATE_SHOW_SQL = "hibernate.show_sql";

    @Autowired
    private Environment env;

    @Bean
    public DataSource dataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(env.getRequiredProperty(PROPERTY_NAME_DATABASE_DRIVER));
        dataSource.setUrl(env.getRequiredProperty(PROPERTY_NAME_DATABASE_URL));
        dataSource.setUsername(env.getRequiredProperty(PROPERTY_NAME_DATABASE_USERNAME));
        dataSource.setPassword(env.getRequiredProperty(PROPERTY_NAME_DATABASE_PASSWORD));
        return dataSource;
    }
}

Spring-Boot multi module project load property-file

I have a Spring Boot application as a multi-module project in Maven. The structure is as follows:
Parent-Project
|--MainApplication
|--Module1
|--ModuleN
In the MainApplication project there is the class with the main() method, annotated with @SpringBootApplication and so on. This project has, as always, an application.properties file which is loaded automatically, so I can access the values with the @Value annotation:
@Value("${myapp.api-key}")
private String apiKey;
Within my Module1 I want to use a properties file as well (called module1.properties), where the module's configuration is stored. This file will only be accessed and used in the module, but I cannot get it loaded. I tried it with @Configuration and @PropertySource, but no luck:
@Configuration
@PropertySource(value = "classpath:module1.properties")
public class ConfigClass {
How can I load a properties file with Spring Boot and access the values easily? I could not find a working solution.
My configuration:
@Configuration
@PropertySource(value = "classpath:tmdb.properties")
public class TMDbConfig {

    @Value("${moviedb.tmdb.api-key}")
    private String apiKey;

    public String getApiKey() {
        return apiKey;
    }

    @Bean
    public static PropertySourcesPlaceholderConfigurer propertySourcesPlaceholderConfigurer() {
        return new PropertySourcesPlaceholderConfigurer();
    }
}
Calling the config:
@Component
public class TMDbWarper {

    @Autowired
    private TMDbConfig tmdbConfig;

    private TmdbApi tmdbApi;

    public TMDbWarper() {
        tmdbApi = new TmdbApi(tmdbConfig.getApiKey());
    }
}
I get a NullPointerException in the constructor when I autowire the warper.
For field injection:
Fields are injected right after construction of a bean, before any config methods are invoked. Such a config field does not have to be public. Refer to the @Autowired annotation javadoc for complete usage. Use constructor injection in this case, like below:
@Component
public class TMDbWarper {

    private TMDbConfig tmdbConfig;
    private TmdbApi tmdbApi;

    @Autowired
    public TMDbWarper(final TMDbConfig tmdbConfig) {
        this.tmdbConfig = tmdbConfig;
        tmdbApi = new TmdbApi(tmdbConfig.getApiKey());
    }
}
(or)
Use @PostConstruct to initialise it, like below:
@Component
public class TMDbWarper {

    @Autowired
    private TMDbConfig tmdbConfig;

    private TmdbApi tmdbApi;

    @PostConstruct
    public void init() {
        // any initialisation method
        tmdbConfig.getConfig();
    }
}
Autowiring is performed just after the creation of the object (after calling the constructor via reflection), so a NullPointerException is expected in your constructor, as the tmdbConfig field is still null while the constructor is being invoked.
You may fix this by using the @PostConstruct callback method as shown below:
@Component
public class TMDbWarper {

    @Autowired
    private TMDbConfig tmdbConfig;

    private TmdbApi tmdbApi;

    public TMDbWarper() {
    }

    @PostConstruct
    public void init() {
        tmdbApi = new TmdbApi(tmdbConfig.getApiKey());
    }

    public TmdbApi getTmdbApi() {
        return this.tmdbApi;
    }
}
The rest of your configuration seems correct to me.
Hope this helps.
Here is a Spring Boot multi-module example where you can get properties in a different module.
Let's say I have a main application module, a dataparse-module and a datasave-module.
StartApp.java in the application module:
@SpringBootApplication
public class StartApp {
    public static void main(String[] args) {
        SpringApplication.run(StartApp.class, args);
    }
}
Configuration in dataparse-module. ParseConfig.java:
@Configuration
public class ParseConfig {
    @Bean
    public XmlParseService xmlParseService() {
        return new XmlParseService();
    }
}
XmlParseService.java:
@Service
public class XmlParseService {...}
Configuration in datasave-module. SaveConfig.java:
@Configuration
@EnableConfigurationProperties(ServiceProperties.class)
@Import(ParseConfig.class) // get beans from dataparse-module, in this case XmlParseService
public class SaveConfig {
    @Bean
    public SaveXmlService saveXmlService() {
        return new SaveXmlService();
    }
}
ServiceProperties.java:
#ConfigurationProperties("datasave")
public class ServiceProperties {
private String message;
public String getMessage() {
return message;
}
public void setMessage(String message) {
this.message = message;
}
}
application.properties in datasave-module in resource/config folder:
datasave.message=Multi-module Maven project!
threads.xml.number=5
file.location.on.disk=D:\temp\registry
Then in the datasave-module you can use all your properties, either through @Value:
SaveXmlService.java:
@Service
public class SaveXmlService {

    @Autowired
    XmlParseService xmlParseService;

    @Value("${file.location.on.disk: none}")
    private String fileLocation;

    @Value("${threads.xml.number: 3}")
    private int numberOfXmlThreads;

    ...
}
Or through ServiceProperties:
Service.java:
@Component
public class Service {

    @Autowired
    ServiceProperties serviceProperties;

    public String message() {
        return serviceProperties.getMessage();
    }
}
I had this situation before; I noticed that the properties file was not copied into the jar.
I did the following to get it working:
In the resources folder, I created a unique package and stored my application.properties file inside it, e.g. com/company/project.
In the configuration class, e.g. TMDBConfig.java, I referenced the full path of my .properties file:
@Configuration
@PropertySource("classpath:/com/company/project/application.properties")
public class AwsConfig
Build and run, it will work like magic.
You could autowire and use the Environment bean to read the property:
@Configuration
@PropertySource(value = "classpath:tmdb.properties")
public class TMDbConfig {

    @Autowired
    private Environment env;

    public String getApiKey() {
        return env.getRequiredProperty("moviedb.tmdb.api-key");
    }
}
This should guarantee that the property is read from the context when you invoke the getApiKey() method, regardless of when the @Value expression is resolved by the PropertySourcesPlaceholderConfigurer.
