Spring Boot application.properties lifecycle - java

I have a large Spring web application, dating back several years. I need to update it to Spring Boot (corporate requirement). I'm well on my way - it starts (!), although there are some issues with property injection that cause the app to fail.
Specifically, there are three huge config files per profile, e.g. qa_config.properties, qa_ehcache.xml and qa_monitoring.properties. At present I only care about qa_config.properties and qa_monitoring.properties, which I have renamed to Spring Boot's preferred names, application-qa.properties and application-qa_monitoring.properties.
The application has a number of classes annotated with @Named (from the javax.ws.rs-api) which are loaded early - so early that I need to inject properties in the constructor:
package com.domain.app;

import org.springframework.beans.factory.annotation.Value;

import javax.inject.Named;

@Named
public class Foo {

    // Can't use @Value here, it is not yet in the context
    protected String bar; // lives in application-qa.properties
    protected String qux; // lives in application-qa_monitoring.properties

    public Foo(@Value("${application.property.named.bar}") String bar,
               @Value("${monitoring.property.named.qux}") String qux) {
        this.bar = bar;
        this.qux = qux;
        doSomeWork();
    }
}
Properties files:
#application-qa.properties
application.property.named.bar=something
and
#application-qa_monitoring.properties
monitoring.property.named.qux=another_thing
My problem: I want to have both application-qa.properties and application-qa_monitoring.properties in context as soon as possible, and before the @Named classes are loaded.
To achieve this, I am running the application with an active profile of qa, which successfully adds that set of properties into the context.
I added this line to the application.properties file, to ensure that the other properties are loaded:
spring.profiles.include=${spring.profiles.active}_monitoring.properties
When I run the Spring Boot app, the output tells me
The following profiles are active: qa_monitoring.properties,qa
When debugging the Foo class, the value of bar is correct
But, the value of qux is null.
Am I missing something about the order in which properties files are loaded? I would have thought that the include line in application.properties would be sufficient to "flatten" the two files very early on, if one is in context, so both should be available?
What I could do instead is just throw all the vars from the two properties files into one, application-qa.properties, but I'd like, if possible, to keep them separate and as close to the original structure as possible.

Thanks to pvpkiran and Andy Brown.
My application.properties file should have read
spring.profiles.include=${spring.profiles.active}_monitoring
i.e. just adding another profile, in this case qa_monitoring - Spring automagically adds the application- prefix and the .properties suffix.
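As a quick sanity check (not part of the original answer; the runner below is a throwaway illustration), something like this can confirm that both property sources are visible once the included profile is active:

import org.springframework.boot.CommandLineRunner;
import org.springframework.core.env.Environment;
import org.springframework.stereotype.Component;

@Component
public class PropertySanityCheck implements CommandLineRunner {

    private final Environment env;

    public PropertySanityCheck(Environment env) {
        this.env = env;
    }

    @Override
    public void run(String... args) {
        // Both keys should resolve once the qa and qa_monitoring profiles are active
        System.out.println("bar = " + env.getProperty("application.property.named.bar"));
        System.out.println("qux = " + env.getProperty("monitoring.property.named.qux"));
    }
}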

The issue you are having is that you use a literal value instead of a placeholder in the @Value annotation for your qux parameter.
Replace
public Foo(@Value("${application.property.named.bar}") String bar,
           @Value("monitoring.property.named.qux") String qux) {
With
public Foo(@Value("${application.property.named.bar}") String bar,
           @Value("${monitoring.property.named.qux}") String qux) {
And it should work.

Related

Spring conditionally require bean only if profile is active

I have a class which is disabled based on @Profile. I want to use it inside another class that is not conditional on the same profile:
@Component
@Profile("!local")
public class NotAlwaysExistingClass {
    public void doThing() {
        ...
    }
}

public class AlwaysExistingClass {

    @Autowired(required=true)
    NotAlwaysExistingClass notAlwaysExisting;

    // Impossible for this to happen if profile is "local"
    public void notAlwaysDoneThing() {
        notAlwaysExisting.doThing();
    }
    ...
}
I don't want to set @Autowired(required=false) in all cases. Is it possible to disable the requirement only if a certain profile is active? I want to do this to make it more convenient to occasionally run the code locally, but without compromising the application or making major changes to the class structure.
I agree with @xerx593's option #1, but you could also change that a little. You could extract an interface and have the depending class use it via that interface. Then you would have two beans that implement that interface, only one of which is available at a given time via @Profile selection. Remember @Autowired is by type by default. A minimal sketch of this is shown below.
Really this issue is similar (or the same) to having a couple of profiles for the various needs of a datasource, for example. In my projects, the local profile points to a local DB, the regular one points to some cloud DB via env variables or whatever, and then I have a "cicd" profile for integration tests which uses a spun-up H2 DB.
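A minimal sketch of the interface-extraction idea, using hypothetical names (ThingDoer, RealThingDoer, NoOpThingDoer), each type in its own file:

import org.springframework.context.annotation.Profile;
import org.springframework.stereotype.Component;

public interface ThingDoer {
    void doThing();
}

@Component
@Profile("!local")
class RealThingDoer implements ThingDoer {
    @Override
    public void doThing() {
        // real work, only wired when the profile is not "local"
    }
}

@Component
@Profile("local")
class NoOpThingDoer implements ThingDoer {
    @Override
    public void doThing() {
        // do nothing (or just log) when running locally
    }
}

AlwaysExistingClass can then autowire a ThingDoer; exactly one implementation exists for any given profile, so required=true stays satisfied.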
"Smart" (tricky) (?) approach:
NO-OP Bean/Profile ;)
Introduce an other "bean" (or "class"), which:
extends NotAlwaysExistingClass
takes #Profile("local") (so the logical complement of the "non-local" profile)
overrides doThing(), but with no-op/cheap/only logging code.
Done.
you don't need (further) refactorings
you can leave the required attribute (one of the profiles will always strike)
in "non-local" profile, you get the right bean
in "local" profile: nice logging/no-op :)

Spring Boot @Value annotation picks up key from application.properties, but does not use it at run time

The value marked with @Value isn't getting injected at run time. My code looks like this:
import com.example.MyClass;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class MyConfig {

    @Value("${api.key}")
    private String apiKey;

    @Bean
    public MyClass getKey() {
        return new MyClass(apiKey);
    }
}
When I hover my mouse over ${api.key} in IntelliJ it shows the key. And if I replace ${api.key} with the key itself, everything works perfectly fine. So the IDE knows that ${api.key} is the key from application.properties, but at run time it doesn't use it, and I get a 401 error (because the field is null). Even if I go to application.properties and use "Find usages" on api.key, it shows the right place in the code where it is used.
If I run mvn spring-boot:run, everything executes like it should, except the api key is left null. If I write the key as a string instead of using @Value, it reads it and everything works as it should. The problem seems to be centered on @Value not being injected, although Spring Boot works in every other regard.
I have exhausted all the suggestions I've found online, and nothing works. It's hard to list all the attempts I've made, but I've tried everything I've seen: configuring the pom, project structure, run configurations, adding @PropertySource, settings, and more. I even created the whole project from scratch to see if something had gotten tangled up while I tried to solve it.
It recognizes the properties, it runs Spring, it recognizes the key, but it just doesn't use it.
Try the following approach:
@Configuration
public class MyConfig {

    @Bean
    public MyClass getKey(@Value("${api.key}") String apiKey) {
        return new MyClass(apiKey);
    }
}
Another (even better) option is using configuration properties:
@ConfigurationProperties(prefix = "api")
public class ApiConfigProperties {
    private String key;
    // getters, setters, no-arg constructor
}

@Configuration
@EnableConfigurationProperties(ApiConfigProperties.class)
public class MyConfig {

    @Bean
    public MyClass getKey(ApiConfigProperties config) {
        return new MyClass(config.getKey());
    }
}
I usually prefer the second approach because it allows (with a plugin) the generation of a "special" JSON file that the IDE can read to support autocompletion. Another benefit is that the second way is easily debuggable (place a breakpoint in the setter - it can't be easier) and allows the defaults to be specified in one place.
As for the option suggested in the question, I usually treat @Configuration as a DSL that creates beans, so I don't maintain any state on it. Since a @Configuration class happens to be a Spring bean it allows autowiring, but that doesn't mean you should use it.
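For reference, with prefix = "api" and a field named key, the binding resolves the same entry the question already relies on in application.properties (the value shown is just a placeholder):

api.key=your-actual-key-here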
I have faced a similar issue. I find that you can use the @Value annotation on a setter. This should work:

private String apiKey;

@Value("${api.key}")
public void setApiKey(String apiKey) {
    this.apiKey = apiKey;
}

I cannot explain why or how this works, so if someone has any further explanation feel free to edit my answer.

Overriding/Wrapping spring beans in java based config multiple times

I have a (web-)application that needs special configurations and/or extensions based on the customer using the application. I call these additions "plugins" and they are auto-discovered by classpath scanning when the application starts. For extensions that is incredibly easy. Let's say I want to have a plugin which adds an API that prints "hello world" when the URL /myplugin/greet is called: I just create a @Controller annotated class with the according @RequestMapping, put this in a myplugin.jar, copy that onto the classpath and that's it.
Problems come up when I want to change some defaults and especially if I want to do this multiple times. Let's say my core application has a config like this:
@Configuration
public class CoreConfiguration {

    @Bean
    public Set<String> availableModules() {
        return Collections.singleton("core");
    }
}
Now I have two plugins that don't know about each other (but they do know the CoreConfig), but they both want to add themselves to the list of available modules. How would I do that? If I only had a single plugin that wants to override the module list I could override the existing bean from CoreConfiguration, but with two plugins that becomes a problem. What I imagine is something like this:
@Configuration
public class FirstPluginConfiguration {

    @Bean
    public Set<String> availableModules(Set<String> availableModules) {
        Set<String> extendedSet = new HashSet<>(availableModules);
        extendedSet.add("FirstPlugin");
        return extendedSet;
    }
}
Of course a SecondPluginConfiguration would look nearly exactly like this, except that the Set is extended by "SecondPlugin" instead of "FirstPlugin". I tested it to see what would happen: Spring just never calls the First/SecondPluginConfiguration availableModules methods, but it does not show an error either.
Now of course in this case this could easily be solved by using a mutable Set in the CoreConfiguration and then autowiring and extending the set in the other configurations, but for example I also want to be able to add method interceptors to some beans. So for example I might have an interface CrashLogger which has a logCrash(Throwable t) method and in CoreConfiguration a ToFileCrashLogger is created that writes stack traces to files as the name suggests. Now a plugin could say that he also wants to get notified about crashes, for example the plugin wants to ADDITIONALLY send the stacktrace to someone by email. For that matter that plugin could wrap the CrashLogger configured by the CoreConfiguration and fire BOTH. A second plugin could wrap the wrapper again and do something totally different with the stacktrace and still call both of the other CrashLoggers.
The latter does sound somewhat like AOP, and if I just let ALL my beans be proxied (I did not test that) I could autowire them into my plugin configurations, cast them to org.springframework.aop.framework.Advised and then add advices that manipulate behaviour. However, it does seem like huge overkill to generate proxies for each and every one of my beans just so that a plugin can potentially add one or two advices on one or two beans.
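Purely to illustrate the wrapping described in the question (this is not an answer, and EmailForwardingCrashLogger is a hypothetical name), such a plugin decorator could look like this:

// Wraps the CrashLogger produced by CoreConfiguration and notifies both targets.
public class EmailForwardingCrashLogger implements CrashLogger {

    private final CrashLogger delegate; // e.g. the ToFileCrashLogger from CoreConfiguration

    public EmailForwardingCrashLogger(CrashLogger delegate) {
        this.delegate = delegate;
    }

    @Override
    public void logCrash(Throwable t) {
        delegate.logCrash(t); // keep the original behaviour
        sendByEmail(t);       // additionally notify someone by email
    }

    private void sendByEmail(Throwable t) {
        // mail-sending details omitted
    }
}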

Configuring DropWizard Programmatically

I have essentially the same question as here but am hoping to get a less vague, more informative answer.
I'm looking for a way to configure DropWizard programmatically, or at the very least, to be able to tweak configs at runtime. Specifically I have a use case where I'd like to configure metrics in the YAML file to be published with a frequency of, say, 2 minutes. This would be the "normal" default. However, under certain circumstances, I may want to speed that up to, say, every 10 seconds, and then throttle it back to the normal/default.
How can I do this, and not just for the metrics.frequency property, but for any config that might be present inside the YAML config file?
Dropwizard reads the YAML config file and configures all the components only once on startup. Neither the YAML file nor the Configuration object is used ever again. That means there is no direct way to configure on run-time.
It also doesn't provide special interfaces/delegates where you can manipulate the components. However, you can access the objects of the components (usually; if not you can always send a pull request) and configure them manually as you see fit. You may need to read the source code a bit but it's usually easy to navigate.
In the case of metrics.frequency you can see that the MetricsFactory class creates ScheduledReporterManager objects per metric type using the frequency setting, and it doesn't look like you can change them at runtime. But you can probably work around it somehow or, even better, modify the code and send a pull request to the Dropwizard community.
Although this feature isn't supported out of the box by Dropwizard, you're able to accomplish this fairly easily with the tools they give you. Note that the below solution definitely works on config values you've provided, but it may not work for built-in configuration values.
Also note that this doesn't persist the updated config values to the config.yml. However, this would be easy enough to implement yourself simply by writing to the config file from the application. If anyone would like to write this implementation feel free to open a PR on the example project I've linked below.
Code
Start off with a minimal config:
config.yml
myConfigValue: "hello"
And its corresponding configuration class:
ExampleConfiguration.java
public class ExampleConfiguration extends Configuration {

    private String myConfigValue;

    public String getMyConfigValue() {
        return myConfigValue;
    }

    public void setMyConfigValue(String value) {
        myConfigValue = value;
    }
}
Then create a task which updates the config:
UpdateConfigTask.java
public class UpdateConfigTask extends Task {

    ExampleConfiguration config;

    public UpdateConfigTask(ExampleConfiguration config) {
        super("updateconfig");
        this.config = config;
    }

    @Override
    public void execute(Map<String, List<String>> parameters, PrintWriter output) {
        config.setMyConfigValue("goodbye");
    }
}
Also for demonstration purposes, create a resource which allows you to get the config value:
ConfigResource.java
#Path("/config")
public class ConfigResource {
private final ExampleConfiguration config;
public ConfigResource(ExampleConfiguration config) {
this.config = config;
}
#GET
public Response handleGet() {
return Response.ok().entity(config.getMyConfigValue()).build();
}
}
Finally wire everything up in your application:
ExampleApplication.java (excerpt)
environment.jersey().register(new ConfigResource(configuration));
environment.admin().addTask(new UpdateConfigTask(configuration));
Usage
Start up the application then run:
$ curl 'http://localhost:8080/config'
hello
$ curl -X POST 'http://localhost:8081/tasks/updateconfig'
$ curl 'http://localhost:8080/config'
goodbye
How it works
This works simply by passing the same reference to the constructors of ConfigResource and UpdateConfigTask. If you aren't familiar with the concept, see here:
Is Java "pass-by-reference" or "pass-by-value"?
The linked classes above are to a project I've created which demonstrates this as a complete solution. Here's a link to the project:
scottg489/dropwizard-runtime-config-example
Footnote: I haven't verified this works with the built in configuration. However, the dropwizard Configuration class which you need to extend for your own configuration does have various "setters" for internal configuration, but it may not be safe to update those outside of run().
Disclaimer: The project I've linked here was created by me.
I solved this with bytecode manipulation via Javassist.
In my case, I wanted to change the "influx" reporter, and modifyInfluxDbReporterFactory should be run BEFORE Dropwizard starts:
private static void modifyInfluxDbReporterFactory() throws Exception {
    ClassPool cp = ClassPool.getDefault();
    // do NOT use InfluxDbReporterFactory.class.getName() as this would force the class into the classloader
    CtClass cc = cp.get("com.izettle.metrics.dw.InfluxDbReporterFactory");
    CtMethod m = cc.getDeclaredMethod("setTags");
    m.insertAfter(
        "if (tags.get(\"cloud\") != null) tags.put(\"cloud_host\", tags.get(\"cloud\") + \"_\" + host);"
            + "tags.put(\"app\", \"sam\");");
    cc.toClass();
}
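For context (this wiring is not shown in the original answer, and the application class name is hypothetical), the patch would be invoked from main before handing control to Dropwizard:

public static void main(String[] args) throws Exception {
    // Patch InfluxDbReporterFactory before any Dropwizard code loads it
    modifyInfluxDbReporterFactory();
    new MyDropwizardApplication().run(args);
}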

Possible to have an ApplicationScoped bean that skins a JSF 2 application with a richfaces skin?

In production I want the user to be able to write a properties file and upload that file to our production server. Once this is in place, it will contain the properties needed for a RichFaces skin. This file can be named anything.
In development I want the properties file to be read from inside WEB-INF/myprop.properties, where all my other properties files are. This file can also be named anything.
So far I have done this:
@ManagedBean(name = "myCustomSkin")
@ApplicationScoped
public class MyCustomSkin extends SkinFactoryImpl {
    /*
     * In here I call
     * Skin s = this.buildSkin(context, "skin"); in my constructor.
     * I am also overriding the loadProperties() method so it loads my properties just fine.
     * For some reason I can't get my app to actually use the properties I have loaded.
     */
}
Any ideas? Basically I want to dynamically skin my RichFaces application via an ApplicationScoped ManagedBean that gets initialized on startup of my Tomcat 7 server. Ideally the name of the skin file would be dynamic user input, possibly read from the database or a different properties file.
EDIT 1: I have gotten it to load the properties file, and then I manually (through Java code) tried to insert a context-param using servletContext.setInitParameter("org.richfaces.skin", this.skin);, where this.skin is a String variable whose value I get like this: this.skin = s.getName(); (as you can see above, s is just the Skin object I get back from this.buildSkin). This results in a SkinNotFound exception, because I am assuming my skin isn't getting placed on my classpath or something.
EDIT 2: Using JSF 2.1.17, Richfaces 4.3.0, and Tomcat7
EDIT 3: Is there a way to tell richfaces to look in a different directory for the myskin.skin.properties file?
Thanks
To implement something like this, the RichFaces source code is your best friend.
Here is what I have found how you can do what you want:
Add a file to a jar (or anywhere else, so that it appears on the classpath):
META-INF/services/org.richfaces.application.Module
The content of the file should be com.example.CustomModule.
Implementation of the custom module can be like this:
public class CustomModule implements Module {
    public void configure(ServicesFactory factory) {
        factory.setInstance(SkinFactory.class, new CustomSkinFactoryImpl());
    }
}
And then implement SkinFactory according to your needs, for example (if you want to extend default behavior with your CustomSkin):
public class CustomSkinFactoryImpl extends SkinFactoryImpl {
    public Skin getSkin(FacesContext context) {
        return new CompositeSkinImpl(new CustomSkin(), super.getSkin(context));
    }
}
