In my Spring Boot project I use the default Jackson ObjectMapper. I'd like to add a new ObjectMapper to the Spring context and start using it in new places, but also keep the default one. Adding a new @Bean definition overrides the default ObjectMapper. How can I add a new ObjectMapper bean without overriding the existing one?
Yes, @ConditionalOnMissingBean is hard (practically impossible) to work around directly. But with a simple trick we can sidestep the problem entirely:
Wrap your additional (non-auto-configured) beans in something else: a custom "wrapper" (at the cost of referring to the wrapper, thinking about the difference, and a little more complexity).
MappingJackson2HttpMessageConverter (auto-configured by Spring Boot) already has this built-in capability, and purpose: mapping to multiple object mappers for HTTP conversion.
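(As a hedged aside: one concrete form of that capability, assuming Spring Framework 5.3.4 or later, is the registerObjectMappersForType hook. The target class Foo and the media type below are purely illustrative, and fooMapper() is the helper method shown further down.)

// Illustrative only: attach an extra ObjectMapper to the converter for one payload type.
MappingJackson2HttpMessageConverter converter = new MappingJackson2HttpMessageConverter();
converter.registerObjectMappersForType(Foo.class, registrar ->
        registrar.put(MediaType.APPLICATION_JSON, fooMapper()));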
Back to the wrapper idea: with a generic, java.util.Map-based class like:
class MyWrapper<K, V> {

    private final Map<K, V> map;

    public MyWrapper(Map<K, V> map) {
        this.map = map;
    }

    public Map<K, V> getMap() {
        return map;
    }
}
We can wire it up:
@Bean
MyWrapper<String, ObjectMapper> myStr2OMWrapper(/*ObjectMapper jacksonOm*/) {
    return new MyWrapper<>(Map.of(
            // DEFAULT, jacksonOm,
            "foo", fooMapper(),
            "bar", barMapper()
    ));
}
...where fooMapper() and barMapper() can refer to plain (static or instance) non-bean methods:
private static ObjectMapper fooMapper() {
    return new ObjectMapper()
            .configure(SerializationFeature.INDENT_OUTPUT, true)   // just a demo...
            .configure(SerializationFeature.WRAP_ROOT_VALUE, true); // configure/set as you see fit...
}

private static ObjectMapper barMapper() {
    return new ObjectMapper()
            .configure(SerializationFeature.INDENT_OUTPUT, false)   // just a demo...
            .configure(SerializationFeature.WRAP_ROOT_VALUE, false); // configure/set more...
}
Time to test and use it:
package com.example.demo;

import com.fasterxml.jackson.databind.ObjectMapper;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;

@SpringBootTest
class DemoAppTests {

    @Autowired
    MyWrapper<String, ObjectMapper> my;

    @Autowired
    ObjectMapper jacksonOM;

    @Test
    void contextLoads() {
        System.err.println(jacksonOM);
        Assertions.assertNotNull(jacksonOM);
        my.getMap().entrySet().forEach(e -> {
            System.err.println(e);
            Assertions.assertNotNull(e.getValue());
        });
    }
}
This prints (e.g.):
...
com.fasterxml.jackson.databind.ObjectMapper@481b2f10
bar=com.fasterxml.jackson.databind.ObjectMapper@577bf0aa
foo=com.fasterxml.jackson.databind.ObjectMapper@7455dacb
...
Results:
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0
...
Sorry, this test doesn't verify the individual configuration, only that we get (visibly different) non-null object mappers.
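To also check the individual configuration, one could add an assertion on a feature that foo/bar set differently above, e.g. (a small sketch on top of the test class shown):

@Test
void mappersAreConfiguredDifferently() {
    // fooMapper() enables INDENT_OUTPUT, barMapper() disables it (see the methods above)
    Assertions.assertTrue(my.getMap().get("foo").isEnabled(SerializationFeature.INDENT_OUTPUT));
    Assertions.assertFalse(my.getMap().get("bar").isEnabled(SerializationFeature.INDENT_OUTPUT));
}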
How to enable (multiple!) my.custom.jackson.* auto-configuration is a more complex question... (it is not as easy as, e.g., my.custom.datasource.* config.)
With:
@Bean
@Primary // for auto-config we need one primary (whether it is "spring.jackson"... adjust as needed)
@ConfigurationProperties("spring.jackson")
JacksonProperties springJacksonProps() {
    return new JacksonProperties();
}

@Bean
@ConfigurationProperties("foo.jackson")
JacksonProperties fooProps() {
    return new JacksonProperties();
}

@Bean
@ConfigurationProperties("bar.jackson")
JacksonProperties barProps() {
    return new JacksonProperties();
}
we can already load and differentiate full-blown config like:
spring.jackson.locale=en_US
spring.jackson.time-zone=UTC
# ... all of spring.jackson; see org.springframework.boot.autoconfigure.jackson.JacksonProperties
foo.jackson.locale=en_US
foo.jackson.time-zone=PST
# ... just for demo purpose
bar.jackson.locale=de_DE
bar.jackson.time-zone=GMT+1
And we can also pass them (the props) to the corresponding (static fooMapper/barMapper) methods without any problem... but then what? (If that is good enough for you, you can stop reading here!)
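For illustration, a hedged sketch of what passing those props could look like (a variant of the myStr2OMWrapper bean above; only two of the many JacksonProperties settings are applied by hand here):

@Bean
MyWrapper<String, ObjectMapper> myStr2OMWrapper(@Qualifier("fooProps") JacksonProperties fooProps,
                                                @Qualifier("barProps") JacksonProperties barProps) {
    return new MyWrapper<>(Map.of(
            "foo", customMapper(fooProps),
            "bar", customMapper(barProps)));
}

private static ObjectMapper customMapper(JacksonProperties props) {
    ObjectMapper om = new ObjectMapper();
    if (props.getLocale() != null) {
        om.setLocale(props.getLocale());     // e.g. foo.jackson.locale
    }
    if (props.getTimeZone() != null) {
        om.setTimeZone(props.getTimeZone()); // e.g. foo.jackson.time-zone
    }
    return om;
}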
Unfortunately the corresponding "state of the art" code (which wires JacksonProperties into the ObjectMapper builder) is not public (i.e. not extendable/pluggable).
Instead, the auto-configuration provides (if none is defined / @ConditionalOnMissingBean):
a prototype Jackson2ObjectMapperBuilder bean, which (every time it is requested)
applies (i.e. receives) customization from all known Jackson2ObjectMapperBuilderCustomizer beans,
of which one (auto-configured, order 0, package-private) is the "standard" customizer responsible for wiring the spring.jackson.* JacksonProperties (only) into the Jackson2ObjectMapperBuilder...
So currently the simplest approach seems to be:
steal/adapt that code (with or without implementing Jackson2ObjectMapperBuilderCustomizer), and
construct the builders/mappers from the "stolen" code plus the properties, as you see fit.
For example (review and TEST before production!): the following does not implement the customizer interface, returns a Jackson2ObjectMapperBuilder, and mimics the auto-configured one without applying the (other) customizers:
// ...
import com.fasterxml.jackson.databind.Module; // !! not java.lang.Module ;)
// ...

private static class MyStolenCustomizer {

    private final JacksonProperties jacksonProperties;
    private final Collection<Module> modules;
    // additionally needed/wanted:
    private final ApplicationContext applicationContext;

    // copied/adapted from spring-boot:
    private static final Map<?, Boolean> FEATURE_DEFAULTS = Map.of(
            SerializationFeature.WRITE_DATES_AS_TIMESTAMPS, false,
            SerializationFeature.WRITE_DURATIONS_AS_TIMESTAMPS, false);

    public MyStolenCustomizer(
            ApplicationContext applicationContext,
            JacksonProperties jacksonProperties,
            Collection<Module> modules) {
        this.applicationContext = applicationContext;
        this.jacksonProperties = jacksonProperties;
        this.modules = modules;
    }

    // changed method signature!!
    public Jackson2ObjectMapperBuilder buildCustom() {
        // mimic the original (spring-boot) bean:
        Jackson2ObjectMapperBuilder builder = new Jackson2ObjectMapperBuilder();
        builder.applicationContext(applicationContext);
        // without (additional!) customizers:
        if (this.jacksonProperties.getDefaultPropertyInclusion() != null) {
            builder.serializationInclusion(this.jacksonProperties.getDefaultPropertyInclusion());
        }
        if (this.jacksonProperties.getTimeZone() != null) {
            builder.timeZone(this.jacksonProperties.getTimeZone());
        }
        configureFeatures(builder, FEATURE_DEFAULTS);
        configureVisibility(builder, this.jacksonProperties.getVisibility());
        configureFeatures(builder, this.jacksonProperties.getDeserialization());
        configureFeatures(builder, this.jacksonProperties.getSerialization());
        configureFeatures(builder, this.jacksonProperties.getMapper());
        configureFeatures(builder, this.jacksonProperties.getParser());
        configureFeatures(builder, this.jacksonProperties.getGenerator());
        configureDateFormat(builder);
        configurePropertyNamingStrategy(builder);
        configureModules(builder);
        configureLocale(builder);
        configureDefaultLeniency(builder);
        configureConstructorDetector(builder);
        // custom API:
        return builder; // ...alternatively: builder.build();
    }

    // ... rest as in https://github.com/spring-projects/spring-boot/blob/main/spring-boot-project/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/jackson/JacksonAutoConfiguration.java#L223-L341
}
To wire the modules, we can (hopefully, as in the original) rely on:
@Autowired
ObjectProvider<com.fasterxml.jackson.databind.Module> modules;
...and initialize them like:
@Bean
MyStolenCustomizer fooCustomizer(ApplicationContext context,
        @Qualifier("fooProps") JacksonProperties fooProperties,
        ObjectProvider<Module> modules) {
    return new MyStolenCustomizer(context, fooProperties, modules.stream().toList());
}

@Bean
MyStolenCustomizer barCustomizer(ApplicationContext context,
        @Qualifier("barProps") JacksonProperties barProperties,
        ObjectProvider<Module> modules) {
    return new MyStolenCustomizer(context, barProperties, modules.stream().toList());
}
..and use them like:
@Bean
MyWrapper<String, Jackson2ObjectMapperBuilder> myStr2OMBuilderWrapper(
        @Qualifier("fooCustomizer") MyStolenCustomizer fooCustomizer,
        @Qualifier("barCustomizer") MyStolenCustomizer barCustomizer) {
    return new MyWrapper<>(
            Map.of(
                    "foo", fooCustomizer.buildCustom(),
                    "bar", barCustomizer.buildCustom()
            )
    );
}
...avoiding "double customization" while leaving JacksonAutoConfiguration enabled and intact.
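As a usage sketch (the FooSerializer class is hypothetical; bean and key names follow the examples above), a consumer can turn the wrapped builders into ObjectMappers via build():

@Service
class FooSerializer {

    private final ObjectMapper fooMapper;

    FooSerializer(MyWrapper<String, Jackson2ObjectMapperBuilder> builders) {
        // build() creates a new ObjectMapper configured by the "foo" builder
        this.fooMapper = builders.getMap().get("foo").build();
    }

    String toJson(Object value) throws JsonProcessingException {
        return fooMapper.writeValueAsString(value);
    }
}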
The remaining problem: keeping up over time with updates to that (external) code!
If you just want a default ObjectMapper to use, I wrote a small utility with static methods for serializing/deserializing JSON that uses an ObjectMapper internally. You don't have to inject any beans; just use the utility. Here is the Javadoc for the JsonUtils class. It comes with the open-source MgntUtils Java library, written and maintained by me. You can get it as a Maven artifact or on GitHub.
I, too, just faced a similar problem: I had already figured out how to create a new ObjectMapper bean, but no matter what I did I couldn't figure out how to hide it from Spring Boot's auto-configuration (so that it would continue to create the default one). In the end I gave up and simply created the second bean myself, mimicking the default one. I chose to name it, hopefully to avoid any collision, and to declare it @Primary so it is chosen just as the default would have been.
In any case, making an ObjectMapper is quite easy:
@Bean("standardJsonObjectMapper") // named, though not necessary
@org.springframework.context.annotation.Primary // mimic the default
public com.fasterxml.jackson.databind.ObjectMapper standardJsonObjectMapper() {
    return org.springframework.http.converter.json.Jackson2ObjectMapperBuilder
            .json()
            .build();
}
That builder has MANY methods available for customization (like failOnUnknownProperties(boolean) and featuresToEnable(Object...)); just choose the ones you want, and off you go!
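For illustration, a hedged sketch using a couple of those builder methods (the bean name and the chosen features are arbitrary examples, not part of the original answer):

@Bean("lenientJsonObjectMapper") // hypothetical second, named mapper
public ObjectMapper lenientJsonObjectMapper() {
    return Jackson2ObjectMapperBuilder
            .json()
            .failOnUnknownProperties(false)                        // ignore unknown JSON fields
            .featuresToEnable(SerializationFeature.INDENT_OUTPUT)  // pretty-print output
            .build();
}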
Related
I need to build mappings for classes (literally a Map<Class<?>, String>), which won't vary at runtime, and keeping things decoupled is a priority. Since I'm in a Spring application, I thought I'd use an annotation and ClassPathScanningCandidateComponentProvider more or less like so:
@Inherited
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
public @interface Mapping {
    String value();
}
And:
public class MappingLookUp {

    private static final Map<Class<?>, String> MAPPING_LOOK_UP;

    static {
        Map<Class<?>, String> lookUp = new HashMap<>();
        ClassPathScanningCandidateComponentProvider scanningCandidateComponentProvider =
                new ClassPathScanningCandidateComponentProvider(false);
        scanningCandidateComponentProvider.addIncludeFilter(new AnnotationTypeFilter(Mapping.class));
        for (BeanDefinition beanDefinition : scanningCandidateComponentProvider.findCandidateComponents("blah")) {
            Class<?> clazz;
            try {
                clazz = Class.forName(beanDefinition.getBeanClassName());
            } catch (ClassNotFoundException e) {
                throw new RuntimeException(e);
            }
            Mapping mapping = AnnotationUtils.getAnnotation(clazz, Mapping.class);
            if (mapping == null) {
                throw new IllegalStateException("This should never be null");
            }
            lookUp.put(clazz, mapping.value());
        }
        MAPPING_LOOK_UP = Collections.unmodifiableMap(lookUp);
    }

    public static String getMapping(Class<?> clazz) {
        ...
    }
}
Although I believe this will work, this feels like:
a lot to put in a static initializer
a hacky use of the scanning component provider, even though it's commonly recommended for this purpose; BeanDefinition makes it sound like it's intended for finding Spring beans rather than general class definitions.
To be clear, the annotated values are data classes -- not Spring-managed beans -- so a BeanPostProcessor pattern doesn't fit, and indeed, that's why it feels awkward to use the scanning component provider that, to me, seems intended for discovery of Spring managed beans.
Is this the proper way to be implementing this pattern? Is it a proper application of the provider? Is there a feasible alternative without pulling in other classpath scanning implementations?
I would suggest this doesn't look like it's done in a very Spring-y way.
If I were doing this, I would use Spring's BeanPostProcessor or BeanFactoryPostProcessor. Both of these allow introspection of all beans in Spring's BeanFactory, and would let you get away from the static-ness of your current setup, since the post-processors are just Spring beans themselves.
class MappingLookup implements BeanPostProcessor {

    private final Map<Class<?>, String> lookup = new HashMap<>();

    @Override
    public Object postProcessAfterInitialization(Object bean, String beanName) {
        // check bean's class for annotation...
        // add to lookup map as necessary...
        // make sure to return bean (javadoc explains why)
        return bean;
    }

    public String getMapping(Class<?> clazz) {
        // ...
    }

    // omitted other methods...
}
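For completeness, a hedged sketch of how that skeleton could be filled in, reusing the Mapping annotation from the question (one possible implementation, not necessarily how the answer's author would do it):

class MappingLookup implements BeanPostProcessor {

    private final Map<Class<?>, String> lookup = new ConcurrentHashMap<>();

    @Override
    public Object postProcessAfterInitialization(Object bean, String beanName) {
        // look up the annotation (including meta-annotations) on the bean's class
        Mapping mapping = AnnotationUtils.findAnnotation(bean.getClass(), Mapping.class);
        if (mapping != null) {
            lookup.put(bean.getClass(), mapping.value());
        }
        // always return the bean, otherwise it vanishes from the context
        return bean;
    }

    public String getMapping(Class<?> clazz) {
        return lookup.get(clazz);
    }
}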
I asked a very similar question recently, How to get list of Interfaces from @ComponentScan packages, and finally implemented the first of the suggested approaches.
You can see the code at https://github.com/StanislavLapitsky/SpringSOAProxy, in particular https://github.com/StanislavLapitsky/SpringSOAProxy/blob/master/core/src/main/java/org/proxysoa/spring/service/ProxyableScanRegistrar.java and of course the initializing annotation https://github.com/StanislavLapitsky/SpringSOAProxy/blob/master/core/src/main/java/org/proxysoa/spring/annotation/ProxyableScan.java. The key thing is to add @Import({ProxyableScanRegistrar.class}).
The key code is
public class ProxyableScanRegistrar implements ImportBeanDefinitionRegistrar, EnvironmentAware {

    private Environment environment;

    @Override
    public void setEnvironment(Environment environment) {
        this.environment = environment;
    }

    @Override
    public void registerBeanDefinitions(AnnotationMetadata metadata, BeanDefinitionRegistry registry) {
        // Get the ProxyableScan annotation attributes
        Map<String, Object> annotationAttributes = metadata.getAnnotationAttributes(ProxyableScan.class.getCanonicalName());
        if (annotationAttributes != null) {
            String[] basePackages = (String[]) annotationAttributes.get("value");
            if (basePackages.length == 0) {
                // If the value attribute is not set, fall back to the package of the annotated class
                basePackages = new String[]{((StandardAnnotationMetadata) metadata).getIntrospectedClass().getPackage().getName()};
            }
            // ... (rest omitted; see the linked ProxyableScanRegistrar source)
        }
    }
}
I'm building a plugin for Jira. I want to add a caching layer, so I wanted to use com.atlassian.cache.CacheManager. I have to inject this via a constructor argument or setter.
Since I'm extending another class I wanted to inject this via a setter, but for some reason it is null all the time.
import com.atlassian.cache.Cache;
import com.atlassian.cache.CacheLoader;
import com.atlassian.cache.CacheManager;
import com.atlassian.cache.CacheSettingsBuilder;

public class Foo extends AbstractJiraContextProvider
{
    private CacheManager cacheManager;

    public void setCacheManager(CacheManager cacheManager) {
        // It does not get past this function..
        this.cacheManager = cacheManager;
    }

    @Override
    public Map getContextMap(ApplicationUser user, JiraHelper jiraHelper) {
        Cache cache = this.cacheManager.getCache("bar");
        // ...
    }
}
I also tried this by doing the following:
public Foo(CacheManager cacheManager) {
    this.cacheManager = cacheManager;
}
After that, the plugin does nothing anymore. I do not get errors, but it just produces no output.
I used this for documentation: https://developer.atlassian.com/confdev/confluence-plugin-guide/writing-confluence-plugins/accessing-confluence-components-from-plugin-modules
And https://developer.atlassian.com/confdev/development-resources/confluence-developer-faq/how-do-i-cache-data-in-a-plugin#HowdoIcachedatainaplugin?-Instructions
Your question mentions JIRA, but the documentation links that you provide are for Confluence (and outdated).
If you're developing an add-on for a recent version of JIRA (7.2+) then injecting components is now handled by Atlassian Spring Scanner 2, so everything works with annotations.
If you follow the instructions listed here then you should be able to inject components via a constructor like so:
@Component
public class MyService {

    private final IssueService issueService;
    private final InternalComponent internalComponent;

    @Inject
    public MyService(@ComponentImport final IssueService issueService,
                     final InternalComponent internalComponent) {
        this.issueService = issueService;
        this.internalComponent = internalComponent;
    }
}
In the following Spring Java Config:
@Configuration
@EnableAutoConfiguration
@ComponentScan("my.package")
public class Config {

    @Bean
    public BasicBean basicBean1() {
        return new BasicBean("1");
    }

    @Bean
    public BasicBean basicBean2() {
        return new BasicBean("2");
    }

    @Bean
    public ComplexBean complexBeanByParameters(List<BasicBean> basicBeans) {
        return new ComplexBean(basicBeans);
    }

    @Bean
    public ComplexBean complexBeanByReferences() {
        return new ComplexBean(Arrays.asList(basicBean1(), basicBean2()));
    }
}
I can create the two ComplexBeans using parameter injection, which is elegant but has shortcomings if I have several other beans of BasicBean type and only want a few of them (the parameters can of course be of type BasicBean and enumerate by name the beans I'm interested in, but that could turn out to be a very long list, at least for argument's sake). If I wish to reference the beans directly I might use the complexBeanByReferences style, which could be useful for ordering or some other consideration.
But say I want to use the complexBeanByReferences style to reference the complexBeanByParameters bean, something along the lines of:
@Bean
public ComplexBeanRegistry complexBeanRegistry() {
    return new ComplexBeanRegistry(
            Arrays.asList(
                    complexBeanByParameters(), // but this will not work!
                    complexBeanByReferences()
            )
    );
}
How would I reference complexBeanByParameters without having to specify a list of dependencies for complexBeanRegistry, which in all honesty should be completely oblivious of them?
There is the option to just use
public ComplexBeanRegistry complexBeanRegistry(List<ComplexBeans> complexBeans) {...}
of course, but this might not be an option in certain cases, specifically when using the CacheConfigurer from spring-context. In this case the Java Config is intended to
create the beans
by implementing CacheConfigurer, override the default instances of the CacheManager and KeyGenerator beans.
The requirement to implement CacheConfigurer means I can't change the signature to use parameter injection.
So the question is, is there a way to reference complexBeanByParameters using the "direct" reference style?
Maybe you could reference them by separating them with @Qualifier:
@Bean
@Qualifier("complexBeanParam")
public ComplexBean complexBeanByParameters(List<BasicBean> basicBeans) {
    return new ComplexBean(basicBeans);
}

@Bean
@Qualifier("complexBeanRef")
public ComplexBean complexBeanByReferences() {
    return new ComplexBean(Arrays.asList(basicBean1(), basicBean2()));
}
and, for example, autowire it:
@Autowired
@Qualifier("complexBeanParam")
private ComplexBean beanParam;
I want to re-create (new up) a specific bean at runtime (without restarting the server) upon some DB changes. This is how it looks:
@Component
public class TestClass {

    @Autowired
    private MyShop myShop; // bean to be refreshed at runtime

    @PostConstruct // DB listeners
    public void initializeListener() throws Exception {
        //...
        // code to get listeners config
        //...
        myShop.setListenersConfig(listenersConfig);
        myShop.initialize();
    }

    public void restartListeners() throws Exception {
        myShop.shutdownListeners();
        initializeListener();
    }
}
This code does not work, as the myShop object is created by Spring as a singleton and its context does not get refreshed unless the server is restarted. How can I refresh (create a new object for) myShop?
One bad way I can think of is to create a new myShop object inside restartListeners(), but that does not seem right to me.
In DefaultListableBeanFactory you have the public method destroySingleton("beanName"), so you can play with it, but you have to be aware that if you autowired your bean, the injection point will keep the same instance of the object that was autowired in the first place. You can try something like this:
@RestController
public class MyRestController {

    @Autowired
    SampleBean sampleBean;

    @Autowired
    ApplicationContext context;

    @Autowired
    DefaultListableBeanFactory beanFactory;

    @RequestMapping(value = "/")
    @ResponseBody
    public String showBean() throws Exception {
        SampleBean contextBean = (SampleBean) context.getBean("sampleBean");
        beanFactory.destroySingleton("sampleBean");
        return "Compare beans " + sampleBean + "==" + contextBean;
        // while sampleBean stays the same, contextBean gets recreated in the context
    }
}
It is not pretty, but it shows how you can approach it. If you were dealing with a controller rather than a component class, you could inject the bean as a method argument and it would also work, because the bean would not be re-created until it is needed inside the method; at least that's what it looks like. An interesting question is who else holds a reference to the old bean besides the object it was autowired into in the first place, because it has been removed from the context. I wonder whether it still exists or is garbage-collected once released in the controller above; if other objects in the context held a reference to it, the above would cause problems.
We have the same use case. As already mentioned, one of the main issues with re-creating a bean at runtime is how to update the references that have already been injected. This is the main challenge.
To work around this issue I've used Java's AtomicReference<> class. Instead of injecting the bean directly, I've wrapped it in an AtomicReference and injected that. Because the object wrapped by the AtomicReference can be reset in a thread-safe manner, I can use this to change the underlying object when a database change is detected. Below is an example config/usage of this pattern:
@Configuration
public class KafkaConfiguration {

    private static final String KAFKA_SERVER_LIST = "kafka.server.list";
    private static AtomicReference<String> serverList;

    @Resource
    MyService myService;

    @PostConstruct
    public void init() {
        serverList = new AtomicReference<>(myService.getPropertyValue(KAFKA_SERVER_LIST));
    }

    // Just a helper method to check if the value for the server list has changed.
    // Not a big fan of the static usage but needed a way to compare the old / new values.
    public static boolean isRefreshNeeded() {
        MyService service = Registry.getApplicationContext().getBean("myService", MyService.class);
        String newServerList = service.getPropertyValue(KAFKA_SERVER_LIST);

        // Arguably serverList does not need to be Atomic for this usage, as this is executed
        // on a single thread
        if (!StringUtils.equals(serverList.get(), newServerList)) {
            serverList.set(newServerList);
            return true;
        }
        return false;
    }

    public ProducerFactory<String, String> kafkaProducerFactory() {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.CLIENT_ID_CONFIG, "...");

        // Here we are pulling the value for the serverList that has been set;
        // see the init() and isRefreshNeeded() methods above
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, serverList.get());
        configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(configProps);
    }

    @Bean
    @Lazy
    public AtomicReference<KafkaTemplate<String, String>> kafkaTemplate() {
        KafkaTemplate<String, String> template = new KafkaTemplate<>(kafkaProducerFactory());
        AtomicReference<KafkaTemplate<String, String>> ref = new AtomicReference<>(template);
        return ref;
    }
}
I then inject the bean where needed, e.g.
public class MyClass1 {

    @Resource
    AtomicReference<KafkaTemplate<String, String>> kafkaTemplate;
    ...
}

public class MyClass2 {

    @Resource
    AtomicReference<KafkaTemplate<String, String>> kafkaTemplate;
    ...
}
In a separate class I run a scheduler thread that is started when the application context is started. The class looks something like this:
class Manager implements Runnable {

    private ScheduledExecutorService scheduler;

    public void start() {
        scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(this, 0, 120, TimeUnit.SECONDS);
    }

    public void stop() {
        scheduler.shutdownNow();
    }

    @Override
    public void run() {
        try {
            if (KafkaConfiguration.isRefreshNeeded()) {
                AtomicReference<KafkaTemplate<String, String>> kafkaTemplate =
                        (AtomicReference<KafkaTemplate<String, String>>) Registry.getApplicationContext().getBean("kafkaTemplate");

                // Get a new instance here. This will have the new value for the server list
                // that was "refreshed".
                KafkaConfiguration config = new KafkaConfiguration();

                // The set here replaces the wrapped object in a thread-safe manner with the new bean,
                // and thus all injected instances now use the newly created object.
                kafkaTemplate.set(config.kafkaTemplate().get());
            }
        } catch (Exception e) {
        } finally {
        }
    }
}
I am still on the fence about whether this is something I would advocate, as it does have a slight smell to it. But in limited and careful usage it does provide an alternative approach to the stated use case. Please be aware that from a Kafka standpoint this code example will leave the old producer open; in reality one would need to properly flush() and close the old producer. But that's not what the example is meant to demonstrate.
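For completeness, a hedged sketch of what that clean-up could look like inside the run() method above (getAndSet and flush are standard AtomicReference/KafkaTemplate calls, but wiring them in here is my assumption, not part of the original example):

// Swap in the new template and keep the old one in a single atomic step
KafkaTemplate<String, String> oldTemplate =
        kafkaTemplate.getAndSet(config.kafkaTemplate().get());

// Push out anything still buffered by the old producer before dropping it
oldTemplate.flush();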
Is there any way to load a class marked with @ConfigurationProperties without using a Spring context directly? Basically I want to reuse all the smart logic that Spring does but for a bean I manually instantiate outside of the Spring lifecycle.
I have a bean that loads happily in Spring (Boot) and I can inject this into my other Service beans:
@ConfigurationProperties(prefix = "my")
public class MySettings {
    String property1;
    File property2;
}
See the Spring docs for more info: http://docs.spring.io/spring-boot/docs/current-SNAPSHOT/reference/htmlsingle/#boot-features-external-config-command-line-args
But now I need to access this bean from a class that is created outside of Spring (by Hibernate). The class is created so early in the app startup process that Spring Boot has not yet made the application context available through the classic lookup helper methods or roll-my-own static references.
So I instead want to do something like:
MySettings mySettings = new MySettings();
SpringPropertyLoadingMagicClass loader = new SpringPropertyLoadingMagicClass();
loader.populatePropertyValues(mySettings);
And have MySettings end up with all its values loaded, from the command line, system properties, app.properties, etc. Is there some class in Spring that does something like this or is it all too interwoven with the application context?
Obviously I could just load the Properties file myself, but I really want to keep Spring Boot's logic around using command line variables (e.g. --my.property1=xxx), or system variables, or application.properties or even a yaml file, as well as its logic around relaxed binding and type conversion (e.g. property2 is a File) so that it all works exactly the same as when used in the Spring context.
Possible or pipe dream?
Thanks for your help!
I had the same "issue".
Here is how I solved it in Spring Boot versions 1.3.x and 1.4.1.
Let's say we have the following YAML configuration file:
foo:
  apis:
    -
      name: Happy Api
      path: /happyApi.json?v=bar
    -
      name: Grumpy Api
      path: /grumpyApi.json?v=grrr
and we have the following ConfigurationProperties:
@ConfigurationProperties(prefix = "foo")
public class ApisProperties {

    private List<ApiPath> apis = Lists.newArrayList();

    public ApisProperties() {
    }

    public List<ApiPath> getApis() {
        return apis;
    }

    public static class ApiPath {
        private String name;
        private String path;

        public String getName() {
            return name;
        }

        public void setName(final String aName) {
            name = aName;
        }

        public String getPath() {
            return path;
        }

        public void setPath(final String aPath) {
            path = aPath;
        }
    }
}
Then, to do the "magic" things of Spring Boot programmatically (e.g. loading some properties in a static method), you can do:
private static ApisProperties apiProperties() {
    try {
        ClassPathResource resource;
        resource = new ClassPathResource("/config/application.yml");

        YamlPropertiesFactoryBean factoryBean;
        factoryBean = new YamlPropertiesFactoryBean();
        factoryBean.setSingleton(true); // optional, depends on your use case
        factoryBean.setResources(resource);

        Properties properties;
        properties = factoryBean.getObject();

        MutablePropertySources propertySources;
        propertySources = new MutablePropertySources();
        propertySources.addLast(new PropertiesPropertySource("apis", properties));

        ApisProperties apisProperties;
        apisProperties = new ApisProperties();

        PropertiesConfigurationFactory<ApisProperties> configurationFactory;
        configurationFactory = new PropertiesConfigurationFactory<>(apisProperties);
        configurationFactory.setPropertySources(propertySources);
        configurationFactory.setTargetName("foo"); // same prefix as the one defined in @ConfigurationProperties
        configurationFactory.bindPropertiesToTarget();

        return apisProperties; // apisProperties is fed with the values defined in application.yml
    } catch (BindException e) {
        throw new IllegalArgumentException(e);
    }
}
Here's an update to ctranxuan's answer for Spring Boot 2.x. In our situation, we avoid spinning up a Spring context for unit tests, but we do like to test our configuration classes (called AppConfig in this example, with its settings prefixed by app):
public class AppConfigTest {

    private static AppConfig config;

    @BeforeClass
    public static void init() {
        YamlPropertiesFactoryBean factoryBean = new YamlPropertiesFactoryBean();
        factoryBean.setResources(new ClassPathResource("application.yaml"));
        Properties properties = factoryBean.getObject();

        ConfigurationPropertySource propertySource = new MapConfigurationPropertySource(properties);
        Binder binder = new Binder(propertySource);
        config = binder.bind("app", AppConfig.class).get(); // same prefix as in @ConfigurationProperties
    }
}
The "magic" class you are looking for is PropertiesConfigurationFactory. But I would question your need for it - if you only need to bind once, then Spring should be able to do it for you, and if you have lifecycle issues it would be better to address those (in case they break something else).
This post goes in a similar direction but extends the last answer with validation and property placeholder resolution:
Spring Boot Binder API support for @Value Annotations
@Value annotations in @ConfigurationProperties classes don't seem to bind properly, though (at least if the referenced values are not part of the @ConfigurationProperties prefix namespace).
import org.springframework.boot.context.properties.bind.Binder;

Binder binder = Binder.get(environment);
MySettings settings = binder.bind(prefix, MySettings.class).get();