I want to use a YAML configuration file in my project. I am using jackson-dataformat-yaml for parsing YAML files, but I need to parse the comments as well. I used a similar approach in Python using ruamel.yaml. How can I do the same in Java?
Update:
What for? Well, I wanted to make it possible to override my configuration options with command-line arguments. So, to generate a description message for each option, I wanted to use my comments. Like this:
In my config.yml
# Define a source directory
src: '/foo/bar'
# Define a destination directory
dst: '/foo/baz'
So when you run your program with the --help flag, you'll see the following output:
Your program can be run with the following options:
--src Define a source directory
--dst Define a destination directory
The main benefit of such a model is that you never need to repeat the same description twice, because it can be retrieved from the configuration file.
Basically, you have three layers of data:
Your configuration schema. This defines the values that are to be defined in the configuration file.
The configuration file itself, which describes the usual configuration on the current machine.
One-time switches, which override the usual configuration.
The descriptions of what each value does belong to the schema, not to the configuration file itself. Think about it: If someone edits the configuration file on their machine and changes the comments, your help output would suddenly show different descriptions.
My suggestion would be to add the descriptions to the schema. The schema is the Java class you load your YAML into. I am not sure why you are using Jackson, since it uses SnakeYAML as its parser, and SnakeYAML is perfectly able to deserialize into Java classes on its own; it even has more configuration options, since it does not generalize over JSON and YAML the way Jackson does.
Here's a general idea how to do it with SnakeYaml (beware, untested):
// ConfigParam.java
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
public @interface ConfigParam { String description(); }
// Configuration.java
public class Configuration {
    @ConfigParam(description = "Define a source directory")
    String src;
    @ConfigParam(description = "Define a destination directory")
    String dst;
}
// loading code
Yaml yaml = new Yaml(new Constructor(Configuration.class));
Configuration config = yaml.loadAs(input, Configuration.class);
// help generation code
System.out.println("Your program can be ran with the following options:")
for (Field field: Configuration.class.getFields()) {
ConfigParam ann = field.getAnnotation(ConfigParam.class);
if (ann != null) {
System.out.println(String.format("--%s %s", field.getName(), ann.description());
}
}
For mapping actual parameters to the configuration, you can also loop over class fields and map the parameters to the field names after having loaded the configuration (to replace the standard values with the given ones).
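A rough sketch of that mapping step, in the same untested spirit as above (it assumes the parsed config object from the loading code, the args array from main, options arriving as "--name value" pairs, and plain String fields):
// override code, e.g. args = {"--src", "/other/dir"}
for (int i = 0; i + 1 < args.length; i += 2) {
    if (!args[i].startsWith("--")) {
        continue;
    }
    String name = args[i].substring(2);
    try {
        Field field = Configuration.class.getDeclaredField(name);
        if (field.isAnnotationPresent(ConfigParam.class)) {
            field.setAccessible(true);
            field.set(config, args[i + 1]); // the command line wins over config.yml
        }
    } catch (NoSuchFieldException e) {
        System.err.println("Unknown option: --" + name);
    } catch (IllegalAccessException e) {
        throw new IllegalStateException(e);
    }
}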
Related
We've got the following small system:
One Java project containing an annotation processor to generate Java code. Let's call it "A".
A second Java project that depends on "A" and uses its annotation processor. Let's call it "B".
We are using Gradle.
We now want to feed "A" a specific configuration file. E.g. for telling it that the generated classes should have a specific suffix or some given annotations. But since "A" is supposed to be a dependency for several other projects, this configuration file MUST be given in "B". Best case: in the resources folder. So this is the flow we were thinking about:
"B" gets built
This triggers annotation processing of "A"
"A" reads a configuration file that lays in "B"
The generated classes are in the build-folder of "B"
Everything works totally fine, except that "A" is not able to read the configuration file.
So the question is: How do I give any information from "B" to "A" so that "A" uses it in the build-process of "B"? A path to a configuration file would be already sufficient. Or is there any other way to configure an annotation processor from a depending project?
Edit:
This is an example annotation processor that lies in "A":
@SupportedAnnotationTypes("...")
public class BlocklyAnnotationProcessor extends AbstractProcessor {
    @SneakyThrows
    @Override
    public synchronized void init(ProcessingEnvironment processingEnv) {
        super.init(processingEnv);
        // <-- THIS IS WHERE I NEED TO ACCESS THE CONFIG FILE
        this.doFancyStuff();
    }
}
The configuration file might be YAML, pretty similar to application.yml as it is used in Spring Boot:
processor:
  suffix: Impl
This is the project structure
project_A
L src
    L java
        L AnnotationProcessor.java
project_B
L src
    L java
        L ClassToBeProcessedByA.java
    L resources
        L my_config_to_be_read_by_A.yml
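One possible way to hand such a path from "B" to "A" (not from the original question, just a hedged sketch) is a -A compiler argument: in B's build script you would add something like options.compilerArgs.add("-Aprocessor.config=" + file("src/main/resources/my_config_to_be_read_by_A.yml").absolutePath) to the compileJava task, and the processor in "A" reads it back in init. The option name processor.config is made up for the example:
// in project "A": read the option passed by the compiling project "B"
@SupportedOptions("processor.config") // declare the option so javac does not warn about it
@SupportedAnnotationTypes("...")
public class BlocklyAnnotationProcessor extends AbstractProcessor {
    @Override
    public synchronized void init(ProcessingEnvironment processingEnv) {
        super.init(processingEnv);
        String configPath = processingEnv.getOptions().get("processor.config");
        if (configPath != null) {
            // parse the YAML file at configPath here, e.g. with SnakeYAML
        }
    }
}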
While doing annotation processing, I want to get the directories of the source files that are being compiled, without relying on directory/build-tool conventions.
public class MyProcessor extends AbstractProcessor {
    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        // Get the directories with source files to be compiled here
        // For example, I should be able to find all classes which I obtain via the following in the directories I want:
        roundEnv.getElementsAnnotatedWith(MyAnnotation.class);
        return false;
    }
}
I have tried:
A relative path which I got with Paths.get("."). But not only would I still have to rely on directory/build tool conventions to get to the source directories, it also doesn't work when the build is started from anywhere else but the project root.
Checking all properties in System.getProperties() and System.getenv() to see if there is anything useful being set.
Checked javax.lang.model.element.TypeElement, javax.lang.model.type.TypeMirror and the corresponding utility methods to see if I can get an instance's location (Did I miss something?)
Checked various combinations of StandardJavaFileManager's methods, some of which suggest they return what I want, but they either return null/empty collections or throw exceptions for module-related parameters. Example: standardJavaFileManager.list(StandardLocation.SOURCE_PATH, "", Set.of(Kind.SOURCE), true).
Edit:
I'm trying to create functionality where classes can be picked up based on super-types. For that, I created an annotation where I can specify these super-types. I now need to scan all source files to check for classes that are sub-types of X, for which I am using a library. Said library needs to be pointed at a directory containing the classes to scan, which is why I need the source folder.
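If the directory is only needed so that a library can find sub-types of X, a possible alternative (an assumption about the use case, not something from the question) is to let the compiler's own model answer the sub-type question, which avoids the file system entirely:
@Override
public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
    Types types = processingEnv.getTypeUtils();
    Elements elements = processingEnv.getElementUtils();
    // "com.example.X" is a placeholder for the configured super-type; null-check it in real code
    TypeMirror superType = elements.getTypeElement("com.example.X").asType();
    for (Element root : roundEnv.getRootElements()) {
        if (root instanceof TypeElement
                && types.isSubtype(((TypeElement) root).asType(), superType)) {
            // this class from the current compilation extends/implements X
        }
    }
    return false;
}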
In my Android application, I have an annotation processor which generates files using JavaPoet and places them under the package generated.schema.
The files are generated correctly. Whenever I use the generated file like so
GeneratedFile.someGeneratedMethod();
I get the following error:
error: package generated.schema does not exist.
But if I include the fully qualified class name instead of importing like so
generated.schema.GeneratedFile.someGeneratedMethod();
the code compiles and runs without any error.
I don't want to add the complete package name each time I use GeneratedFile. I'm not sure what I did wrong, since I'm still learning to work with annotation processors.
Files generated by other libraries, including Realm and DataBinding, are all working correctly as expected.
File generation:
Using JavaPoet, I run the following code:
if (roundEnvironment.processingOver()) {
    for (TypeElement element : apiList) {
        TypeSpec clazz = generateFile(element);
        JavaFile.builder(NamespaceCreator.generateClassPackage(element), clazz)
                .build()
                .writeTo(filer);
    }
}
NamespaceCreator.generateClassPackage(element) returns the package name for the class, i.e. generated.schema.
While generating classes, I was waiting for the last processing pass. The code generation is encapsulated by
if (roundEnvironment.processingOver())
I was getting a warning because of this:
File for type 'generated.schema.GeneratedFile' created in the last round will not be subject to annotation processing.
I was aware of this warning before I posted the question; however, I was willing to forgo further annotation processing on my generated files for the simplicity of generating all files in one go.
After removing the last-round check from the file generation, I can correctly access the generated files (with an import) without any error; I still don't understand, though, how generating files throughout all rounds affects accessing them via import during the build.
For that I will be posting a new question.
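For reference, a sketch of the per-round variant described above (untested; apiList, generateFile, NamespaceCreator and filer are the helpers from the snippet above, and the elements would now be taken from the current round instead of being collected for the last one):
// inside process(), without the processingOver() check
for (TypeElement element : apiList) {
    TypeSpec clazz = generateFile(element);
    try {
        JavaFile.builder(NamespaceCreator.generateClassPackage(element), clazz)
                .build()
                .writeTo(filer);
    } catch (IOException e) {
        // writeTo declares IOException, so it has to be handled inside process()
        processingEnv.getMessager().printMessage(Diagnostic.Kind.ERROR, e.getMessage(), element);
    }
}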
Problem:
I have 3 parts in the software:
Client A service
Client B service
Target C service
I want to connect to C from A and B
I wrote a library with the following setup:
/src/main/java/pkg.../TargetConnector.java
/src/main/java/pkg.../TargetConfig.java
/src/main/resources/application-dev.properties
/src/main/resources/application-tst.properties
/src/main/resources/application-prd.properties
My clients A and B both have their own sources and properties:
/src/main/java/pkg.../Client{A/B}Service.java
/src/main/java/pkg.../Client{A/B}Config.java
/src/main/resources/application-dev.properties
/src/main/resources/application-tst.properties
/src/main/resources/application-prd.properties
The properties of the Connector contain some login info for the service, e.g.
target.url=https://....
target.usr=blablabla
target.key=mySHAkey
which is used in the TargetConfig to preconfigure the Connector, e.g.
#Value("target.url")
String url;
#Value("target.usr")
String usr;
#Value("target.key")
String key;
#Bean
public TargetConnector connector() {
return new TargetConnector(url, usr, key);
}
Now when I use the connector jar in the client, I can find the configuration via package scan. The connector class is loaded, but the problem is that the properties files are not loaded.
Research
I found that multiple property files cannot have the same name (e.g. the client's application-{profile}.properties clashes with the one from the connector), so I tried to rename the application-{profile}.properties of the TargetConnector to application-connector-{profile}.properties.
The properties, however, still do not get loaded (which makes sense, since I do not have e.g. a connector-dev profile; my profile is simply named dev).
Furthermore, even if I try to explicitly load one of the property files from the connector with:
@PropertySource({"classpath*:application-connector-dev.properties"})
it cannot be found
Question
My question is actually three-tiered:
How can I load a property file in a dependency jar at all?
How can I load the profiled version of the property file if the properties file has a different name than application.properties, e.g. application-connector.properties?
How can I combine the answers to questions 1 and 2 to load the profiled version of the properties in the jar?
If further explanation is needed, please ask.
Answer
I went for an approach as given in the accepted answer.
I just created three configs for the dev, tst and prd profiles containing the needed values and annotated the config classes with the correct profiles.
You are using a @Configuration-annotated class. Maybe you can have one per profile. Here is an example:
@Configuration
@Profile("profileA")
@PropertySource({"classpath:application-profileA.properties"})
public class ConfigurationProfileA {
    @Value("${target.url}")
    String url;
    @Value("${target.usr}")
    String usr;
    @Value("${target.key}")
    String key;

    @Bean
    public TargetConnector connector() {
        return new TargetConnector(url, usr, key);
    }
}
Do the same for profile B (maybe you can structure this better, but the key points here are the annotations @Profile("") and @PropertySource("")).
Once you have your config classes, Spring will use the configuration class you want when you pass --spring.profiles.active=profileA (or whatever name you have written in the @Profile("") annotation).
I think there is a typo in this line: @PropertySource({"classpath*:application-connector-dev.properties"})
Please check by removing the asterisk.
In order to run with a specific profile, you can run with the option --spring.profiles.active=dev, for example.
If you don't run with a profile, it will load the default profile from application.properties, which you don't seem to have.
Furthermore, a piece of advice: always have an application.properties and put in it the common properties and the default values that you would override in the other properties files.
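For example (property names taken from the question, values made up), the shared defaults could live in application.properties and only the environment-specific parts in the profile file:
# application.properties – always loaded, holds the defaults
target.usr=defaultUser
target.key=changeme

# application-dev.properties – loaded on top of it when the dev profile is active
target.url=https://dev.example.org
target.key=devSHAkey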
Another mistake is how you assign properties with the @Value annotation; you need to use @Value("${PROPERTY_FROM_PROPERTIES_FILE}").
Using Spring Boot 1.3.0.RELEASE
I have a couple of YAML files that describe several instances of a program. I now want to parse all those files into a List<Program> (Map, whatever), so I can later search all the programs for the most appropriate instance for given criteria.
I like the approach with @ConfigurationProperties a lot, and it works well enough for a single YAML file, but I haven't found a way yet to read all files in a directory using that method.
Current approach working for a single file:
programs/program1.yml
name: Program 1
minDays: 4
maxDays: 6
can be read by
@Configuration
@ConfigurationProperties(locations = "classpath:programs/program1.yml", ignoreUnknownFields = false)
public class ProgramProperties {
    private Program test; // Program is a POJO with all the fields in the yml.
    // getters+setters
}
I tried changing the locations to an array listing all of my files, locations = {"classpath:programs/program1.yml", "classpath:programs/program2.yml"}, as well as using locations = "classpath:programs/*.yml", but that still only loads the first file (array approach) or nothing at all (wildcard approach).
So, my question is: what is the best way in Spring Boot to load a bunch of YAML files from a classpath directory and parse them into a (list of) POJOs, so they can be autowired in a controller? Do I need to use SnakeYAML directly, or is there an integrated mechanism that I just haven't found yet?
EDIT:
A working approach is doing it manually:
private static final Yaml yaml = new Yaml(new Constructor(Program.class));
private static final ResourcePatternResolver resolver = new PathMatchingResourcePatternResolver();

try {
    for (Resource resource : resolver.getResources("/programs/*.yml")) {
        Object data = yaml.load(resource.getInputStream());
        programList.add((Program) data);
    }
} catch (IOException ioe) {
    logger.error("failed to load resource", ioe);
}
In Spring, it is possible to load multiple configuration properties files using the @PropertySource annotation, but not YAML files. See section 26.6.4 in the link below:
https://docs.spring.io/spring-boot/docs/current/reference/html/boot-features-external-config.html#boot-features-external-config-typesafe-configuration-properties
However, from your problem, it seems that you could configure all your programs in a single YAML file and then get all the programs in a single list.
Sample YAML (all.yaml)
programs:
  - name: A
    min: 1
    max: 2
  - name: B
    min: 3
    max: 4
Config.java
@Configuration
@ConfigurationProperties(locations = {"classpath:all.yaml"})
public class Config {
    private List<Program> programs;

    public void setPrograms(List<Program> programs) {
        this.programs = programs;
    }

    public List<Program> getPrograms() {
        return programs;
    }
}
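The Program POJO referenced above could look like this (a sketch matching the keys in all.yaml):
public class Program {
    private String name;
    private int min;
    private int max;

    // getters and setters are required for the binding
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public int getMin() { return min; }
    public void setMin(int min) { this.min = min; }
    public int getMax() { return max; }
    public void setMax(int max) { this.max = max; }
}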
What I am currently doing, as far as I understood your question, is nearly the same.
I have an application.yml and also profile-specific yml files, e.g. application-{profile}.yml, in my src/main/resources.
In the application.yml I have defined the default profile key-values, which are partially overridden by the profile-specific yml files.
If you want type-safe and well-defined access to your YML keys/values, then you can use the following approach:
@ConfigurationProperties
public class AppSettings {
    String name; // has to be the same as the key in your yml file
    // setters/getters
}
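With that class, the corresponding entry in your yml file would simply be (the value is made up):
name: MyApplication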
In your Spring-Boot config, you have to add the following annotations onto your config class:
@ComponentScan
@EnableAutoConfiguration
@EnableConfigurationProperties(value = {AppSettings.class, SomeOtherSettings.class})
public class SpringContextConfig {
    @Autowired
    private AppSettings appSettings;

    public void test() {
        System.out.println(appSettings.getName());
    }
}
The autowired settings bean is also accessible from other beans.
The other way (without an extra, separate, type-safe class) is to access the YML values via @Value("${name}").
To sum it up briefly:
Yes, it is possible to use several YAML files for your application via Spring profiles. You define your currently active Spring profile via command-line args, programmatically, or via your system environment (SPRING_PROFILES_ACTIVE=name1,name2).
Therefore you can have an application-{profile}.yml file for each profile (see above).
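Setting the active profile programmatically (one of the three ways mentioned above) could look like this, assuming a standard Spring Boot main class:
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.builder.SpringApplicationBuilder;

@SpringBootApplication
public class Application {
    public static void main(String[] args) {
        // activates application-dev.yml on top of application.yml
        new SpringApplicationBuilder(Application.class)
                .profiles("dev")
                .run(args);
    }
}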