How to log the active configuration in a Spring Boot application?

I would really like to use YAML config for Spring Boot, as I find it quite readable and useful to have a single file showing what properties are active in my different profiles. Unfortunately, I'm finding that setting properties in application.yml can be rather fragile.
Things like using a tab instead of spaces will cause properties to not exist (without warnings as far as I can see), and all too often I find that my active profiles are not being set, due to some unknown issue with my YAML.
So I was wondering whether there are any hooks that would enable me to get hold of the currently active profiles and properties, so that I could log them.
Similarly, is there a way to cause start-up to fail if the application.yml contains errors? Either that or a means for me to validate the YAML myself, so that I could kill the start-up process.

In addition to the other answers: logging the active properties on the context refreshed event.
Java 8
package mypackage;

import lombok.extern.slf4j.Slf4j;
import org.springframework.context.event.ContextRefreshedEvent;
import org.springframework.context.event.EventListener;
import org.springframework.core.env.ConfigurableEnvironment;
import org.springframework.core.env.MapPropertySource;
import org.springframework.stereotype.Component;

import java.util.ArrayList;
import java.util.Collection;
import java.util.List;

@Slf4j
@Component
public class AppContextEventListener {

    @EventListener
    public void handleContextRefreshed(ContextRefreshedEvent event) {
        printActiveProperties((ConfigurableEnvironment) event.getApplicationContext().getEnvironment());
    }

    private void printActiveProperties(ConfigurableEnvironment env) {
        System.out.println("************************* ACTIVE APP PROPERTIES ******************************");
        List<MapPropertySource> propertySources = new ArrayList<>();
        env.getPropertySources().forEach(it -> {
            if (it instanceof MapPropertySource && it.getName().contains("applicationConfig")) {
                propertySources.add((MapPropertySource) it);
            }
        });
        propertySources.stream()
                .map(propertySource -> propertySource.getSource().keySet())
                .flatMap(Collection::stream)
                .distinct()
                .sorted()
                .forEach(key -> {
                    try {
                        System.out.println(key + "=" + env.getProperty(key));
                    } catch (Exception e) {
                        log.warn("{} -> {}", key, e.getMessage());
                    }
                });
        System.out.println("******************************************************************************");
    }
}
Kotlin
package mypackage

import mu.KLogging
import org.springframework.context.event.ContextRefreshedEvent
import org.springframework.context.event.EventListener
import org.springframework.core.env.ConfigurableEnvironment
import org.springframework.core.env.EnumerablePropertySource
import org.springframework.stereotype.Component

@Component
class AppContextEventListener {

    companion object : KLogging()

    @EventListener
    fun handleContextRefreshed(event: ContextRefreshedEvent) {
        printActiveProperties(event.applicationContext.environment as ConfigurableEnvironment)
    }

    fun printActiveProperties(env: ConfigurableEnvironment) {
        println("************************* ACTIVE APP PROPERTIES ******************************")
        env.propertySources
                .filter { it.name.contains("applicationConfig") }
                .map { it as EnumerablePropertySource<*> }
                .map { it.propertyNames.toList() }
                .flatMap { it }
                .distinct()
                .sorted()
                .forEach {
                    try {
                        println("$it=${env.getProperty(it)}")
                    } catch (e: Exception) {
                        logger.warn("$it -> ${e.message}")
                    }
                }
        println("******************************************************************************")
    }
}
Output like:
************************* ACTIVE APP PROPERTIES ******************************
server.port=3000
spring.application.name=my-app
...
2017-12-29 13:13:32.843 WARN 36252 --- [ main] m.AppContextEventListener : spring.boot.admin.client.service-url -> Could not resolve placeholder 'management.address' in value "http://${management.address}:${server.port}"
...
spring.datasource.password=
spring.datasource.url=jdbc:postgresql://localhost/my_db?currentSchema=public
spring.datasource.username=db_user
...
******************************************************************************

I had the same problem, and wish there was a debug flag that would tell the profile processing system to spit out some useful logging. One possible way of doing it would be to register an event listener for your application context, and print out the profiles from the environment. I haven't tried doing it this way myself, so your mileage may vary. I think maybe something like what's outlined here:
How to add a hook to the application context initialization event?
Then you'd do something like this in your listener:
System.out.println("Active profiles: " + Arrays.toString(ctxt.getEnvironment().getActiveProfiles()));
Might be worth a try. Another way you could probably do it would be to declare the Environment to be injected in the code where you need to print the profiles. I.e.:
@Component
public class SomeClass {

    @Autowired
    private Environment env;
    ...

    private void dumpProfiles() {
        // Print whatever is needed from env here
    }
}
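The dumpProfiles() body could then use the injected Environment directly, e.g. (a sketch; needs java.util.Arrays, and the property key is just an example):

private void dumpProfiles() {
    // log whichever parts of the environment are useful
    System.out.println("Active profiles: " + Arrays.toString(env.getActiveProfiles()));
    System.out.println("server.port=" + env.getProperty("server.port"));
}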

The Actuator /env endpoint displays properties, but it doesn't show which property value is actually active. Very often you will want to override your application properties with
profile-specific application properties
command line arguments
OS environment variables
Thus you can end up with the same property, with different values, in several sources.
The snippet below prints the active application property values on startup:
import java.util.Arrays;
import java.util.LinkedList;
import java.util.List;
import java.util.Set;
import java.util.TreeSet;

import javax.annotation.PostConstruct;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.env.AbstractEnvironment;
import org.springframework.core.env.PropertiesPropertySource;
import org.springframework.core.env.PropertySource;

@Configuration
public class PropertiesLogger {

    private static final Logger log = LoggerFactory.getLogger(PropertiesLogger.class);

    @Autowired
    private AbstractEnvironment environment;

    @PostConstruct
    public void printProperties() {
        log.info("**** APPLICATION PROPERTIES SOURCES ****");
        Set<String> properties = new TreeSet<>();
        for (PropertiesPropertySource p : findPropertiesPropertySources()) {
            log.info(p.toString());
            properties.addAll(Arrays.asList(p.getPropertyNames()));
        }
        log.info("**** APPLICATION PROPERTIES VALUES ****");
        print(properties);
    }

    private List<PropertiesPropertySource> findPropertiesPropertySources() {
        List<PropertiesPropertySource> propertiesPropertySources = new LinkedList<>();
        for (PropertySource<?> propertySource : environment.getPropertySources()) {
            if (propertySource instanceof PropertiesPropertySource) {
                propertiesPropertySources.add((PropertiesPropertySource) propertySource);
            }
        }
        return propertiesPropertySources;
    }

    private void print(Set<String> properties) {
        for (String propertyName : properties) {
            log.info("{}={}", propertyName, environment.getProperty(propertyName));
        }
    }
}

If application.yml contains errors, it will cause a failure on startup. I guess it depends on what you mean by "error", though. Certainly it will fail if the YAML is not well formed. It will also fail if, for instance, you are setting @ConfigurationProperties that are marked as ignoreInvalidFields=true, or if you set a value that cannot be converted. That's a pretty wide range of errors.
The active profiles will probably be logged on startup by the Environment implementation (but in any case it's easy for you to grab that and log it in your launcher code - the toString() of the Environment will list the active profiles, I think). Active profiles (and more) are also available in the /env endpoint if you add the Actuator.
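For instance, a minimal sketch of logging the active profiles from the launcher (assuming a standard SpringApplication main class; names are illustrative):

import java.util.Arrays;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.ConfigurableApplicationContext;

@SpringBootApplication
public class MyApp {

    public static void main(String[] args) {
        ConfigurableApplicationContext ctx = SpringApplication.run(MyApp.class, args);
        // log the profiles resolved from application.yml, system properties, etc.
        System.out.println("Active profiles: "
                + Arrays.toString(ctx.getEnvironment().getActiveProfiles()));
    }
}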

In case you want to get the active profiles before initializing the beans/application, the only way I found is registering a custom Banner in your SpringBootServletInitializer/SpringApplication (i.e. ApplicationWebXml in a JHipster application).
e.g.
@Override
protected SpringApplicationBuilder configure(SpringApplicationBuilder builder) {
    // set a default to use when no profile is configured.
    DefaultProfileUtil.addDefaultProfile(builder.application());
    return builder.sources(MyApp.class).banner(this::printBanner);
}

/** Custom 'banner' to obtain early access to the Spring configuration to validate and debug it. */
private void printBanner(Environment env, Class<?> sourceClass, PrintStream out) {
    if (env.getProperty("spring.datasource.url") == null) {
        throw new RuntimeException(
                "'spring.datasource.url' is not configured! Check your configuration files and the value of 'spring.profiles.active' in your launcher.");
    }
    ...
}

Related

Broadleaf Commerce Embedded Solr cannot run with root user

I downloaded a fresh 6.1 broadleaf-commerce and ran it via java -javaagent:./admin/target/agents/spring-instrument.jar -jar admin/target/admin.jar successfully on my MacBook. But on my CentOS 7 machine, running sudo java -javaagent:./admin/target/agents/spring-instrument.jar -jar admin/target/admin.jar fails with the following error:
2020-10-12 13:20:10.838 INFO 2481 --- [ main] c.b.solr.autoconfigure.SolrServer : Syncing solr config file: jar:file:/home/mynewuser/seafood-broadleaf/admin/target/admin.jar!/BOOT-INF/lib/broadleaf-boot-starter-solr-2.2.1-GA.jar!/solr/standalone/solrhome/configsets/fulfillment_order/conf/solrconfig.xml to: /tmp/solr-7.7.2/solr-7.7.2/server/solr/configsets/fulfillment_order/conf/solrconfig.xml
*** [WARN] *** Your Max Processes Limit is currently 62383.
It should be set to 65000 to avoid operational disruption.
If you no longer wish to see this warning, set SOLR_ULIMIT_CHECKS to false in your profile or solr.in.sh
WARNING: Starting Solr as the root user is a security risk and not considered best practice. Exiting.
Please consult the Reference Guide. To override this check, start with argument '-force'
2020-10-12 13:20:11.021 ERROR 2481 --- [ main] c.b.solr.autoconfigure.SolrServer : Problem starting Solr
Here is the source code of my Solr configuration; I believe it is the place to change the configuration to run with the -force argument programmatically.
package com.community.core.config;

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.broadleafcommerce.core.search.service.SearchService;
import org.broadleafcommerce.core.search.service.solr.SolrConfiguration;
import org.broadleafcommerce.core.search.service.solr.SolrSearchServiceImpl;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.stereotype.Component;

/**
 * @author Phillip Verheyden (phillipuniverse)
 */
@Component
public class ApplicationSolrConfiguration {

    @Value("${solr.url.primary}")
    protected String primaryCatalogSolrUrl;

    @Value("${solr.url.reindex}")
    protected String reindexCatalogSolrUrl;

    @Value("${solr.url.admin}")
    protected String adminCatalogSolrUrl;

    @Bean
    public SolrClient primaryCatalogSolrClient() {
        return new HttpSolrClient.Builder(primaryCatalogSolrUrl).build();
    }

    @Bean
    public SolrClient reindexCatalogSolrClient() {
        return new HttpSolrClient.Builder(reindexCatalogSolrUrl).build();
    }

    @Bean
    public SolrClient adminCatalogSolrClient() {
        return new HttpSolrClient.Builder(adminCatalogSolrUrl).build();
    }

    @Bean
    public SolrConfiguration blCatalogSolrConfiguration() throws IllegalStateException {
        return new SolrConfiguration(primaryCatalogSolrClient(), reindexCatalogSolrClient(), adminCatalogSolrClient());
    }

    @Bean
    protected SearchService blSearchService() {
        return new SolrSearchServiceImpl();
    }
}
Let me preface this by saying you would be better off simply not starting the application as root. If you are in Docker, you can use the USER command to switch to a non-root user.
The Solr server startup in Broadleaf Community is done programmatically via the broadleaf-boot-starter-solr dependency. This is the wrapper around Solr that ties it to the Spring lifecycle. All of the real magic happens in the com.broadleafcommerce.solr.autoconfigure.SolrServer class.
In that class, you will see a startSolr() method. This method is what adds startup arguments to Solr.
In your case, you will need to mostly copy this method wholesale and use cmdLine.addArgument(...) to add additional arguments. Example:
class ForceStartupSolrServer extends SolrServer {

    public ForceStartupSolrServer(SolrProperties props) {
        super(props);
    }

    protected void startSolr() {
        if (!isRunning()) {
            if (!downloadSolrIfApplicable()) {
                throw new IllegalStateException("Could not download or expand Solr, see previous logs for more information");
            }
            stopSolr();
            synchConfig();
            {
                CommandLine cmdLine = new CommandLine(getSolrCommand());
                cmdLine.addArgument("start");
                cmdLine.addArgument("-p");
                cmdLine.addArgument(Integer.toString(props.getPort()));
                // START MODIFICATION
                cmdLine.addArgument("-force");
                // END MODIFICATION
                Executor executor = new DefaultExecutor();
                PumpStreamHandler streamHandler = new PumpStreamHandler(System.out);
                streamHandler.setStopTimeout(1000);
                executor.setStreamHandler(streamHandler);
                try {
                    executor.execute(cmdLine);
                    created = true;
                    checkCoreStatus();
                } catch (IOException e) {
                    LOG.error("Problem starting Solr", e);
                }
            }
        }
    }
}
Then create an @Configuration class to override the blAutoSolrServer bean created by SolrAutoConfiguration (note the specific package requirement for org.broadleafoverrides.config):
package org.broadleafoverrides.config;

@Configuration
public class OverrideConfiguration {

    @Bean
    public ForceStartupSolrServer blAutoSolrServer(SolrProperties props) {
        return new ForceStartupSolrServer(props);
    }
}

When my Spring app runs, it isn't using my TogglzConfig file

I have a large Spring application that is set up without XML using only annotations. I have made some changes to this application and have a separate project with what should be almost all the same code. However, in this separate project, Togglz seems to be using some sort of default config instead of the TogglzConfig file I've set up.
The first sign that something was wrong was when I couldn't access the Togglz console. I get a 403 Forbidden error despite my config being set to allow anyone to use it (as shown on the Togglz site). I then did some tests and tried to see a list of features and the list is empty when I call FeatureContext.getFeatureManager().getFeatures() despite my Feature class having several features included. This is why I think it's using some sort of default.
Features.java
public enum Features implements Feature {

    FEATURE1,
    FEATURE2,
    FEATURE3,
    FEATURE4,
    FEATURE5;

    public boolean isActive() {
        return FeatureContext.getFeatureManager().isActive(this);
    }
}
TogglzConfiguration.java
@Component
public class TogglzConfiguration implements TogglzConfig {

    public Class<? extends Feature> getFeatureClass() {
        return Features.class;
    }

    public StateRepository getStateRepository() {
        File properties = [internal call to property file];
        try {
            return new FileBasedStateRepository(properties);
        } catch (Exception e) {
            throw new TogglzConfigException("Error getting Togglz configuration from " + properties + ".", e);
        }
    }

    @Override
    public UserProvider getUserProvider() {
        return new UserProvider() {
            @Override
            public FeatureUser getCurrentUser() {
                return new SimpleFeatureUser("admin", true);
            }
        };
    }
}
SpringConfiguration.java
@EnableTransactionManagement
@Configuration
@ComponentScan(basePackages = { "root package for the entire project" }, excludeFilters =
        @ComponentScan.Filter(type = FilterType.ANNOTATION, value = Controller.class))
public class SpringConfiguration {

    @Bean
    public TransformerFactory transformerFactory() {
        return TransformerFactory.newInstance();
    }

    @Bean
    public DocumentBuilderFactory documentBuilderfactory() {
        return DocumentBuilderFactory.newInstance();
    }

    @Bean
    public RestTemplate restTemplate() {
        return new RestTemplate();
    }
}
My project finds a bunch of other beans set up with the @Component annotation. I don't know if the problem is that this component isn't being picked up at all or if Togglz simply isn't using it for some reason. I tried printing the name of the FeatureManager returned by FeatureContext.getFeatureManager() and it is FallbackTestFeatureManager, so this seems to confirm my suspicion that it's just not using my config at all.
Does anyone have any ideas on what I can check? I'm flat out of ideas, especially since this is working in an almost completely identical IntelliJ project on my machine right now. I just can't find out what's different about the Togglz setup or the Spring configurations. Thanks in advance for your help.
I finally had my light bulb moment and solved this problem. In case anyone else has a similar issue, it seems my mistake was having the Togglz testing and JUnit dependencies added to my project but not limiting them to the test scope. I overlooked that part of the site.
<!-- Togglz testing support -->
<dependency>
    <groupId>org.togglz</groupId>
    <artifactId>togglz-testing</artifactId>
    <version>2.5.0.Final</version>
    <scope>test</scope>
</dependency>
Without that scope, I assume these were overriding the Togglz configuration I created with a default test configuration and that was causing my issue.

Environment Specific application.properties file in Spring non-Boot application

I'd like to accomplish this: Environment Specific application.properties file in Spring Boot application
in a Spring non-Boot application. Any idea on how to do that? Now I am setting environment variables to tell the application which properties to use, would prefer to do it the "boot" way.
Help would be appreciated.
In order to represent the several environments, use profiles. If you want to know more, browse this site; I think it is exactly what you are looking for.
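For example, in a plain (non-Boot) Spring application the active profile can be set on the environment before the context is refreshed; a minimal sketch (class and file names are illustrative, and the properties file is assumed to exist on the classpath):

import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;
import org.springframework.context.annotation.PropertySource;

@Configuration
@Profile("development")
@PropertySource("classpath:application-development.properties")
class DevelopmentConfig {
}

public class Launcher {

    public static void main(String[] args) {
        AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext();
        // could also come from -Dspring.profiles.active=development
        ctx.getEnvironment().setActiveProfiles("development");
        ctx.register(DevelopmentConfig.class);
        ctx.refresh();
        // ... use the context ...
        ctx.close();
    }
}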
Update 1:
Considering you have a fixed suffix for your property files and a set of property files for different environments, for example
development-it_works.properties,
test-it_works.properties, etc.,
where it_works is the suffix.
Determine the active environment from active_env.properties:
profiles.active: development
#profiles.active: test
#profiles.active: stage
#profiles.active: production
Write a custom Property resolver
import org.apache.commons.configuration2.Configuration;
import org.apache.commons.configuration2.FileBasedConfiguration;
import org.apache.commons.configuration2.PropertiesConfiguration;
import org.apache.commons.configuration2.builder.FileBasedConfigurationBuilder;
import org.apache.commons.configuration2.builder.fluent.Parameters;

public class MyPropertyUtil {

    public static String getValuesFromPropertyFile(String filename, String key) {
        Configuration config = getConfiguration(filename);
        return config.getString(key);
    }

    public static Configuration getConfiguration(String file) {
        Configuration config = null;
        try {
            Parameters params = new Parameters();
            FileBasedConfigurationBuilder<FileBasedConfiguration> builder =
                    new FileBasedConfigurationBuilder<FileBasedConfiguration>(PropertiesConfiguration.class)
                            .configure(params.properties().setFileName(file));
            config = builder.getConfiguration();
        } catch (Exception ex) {
            ex.printStackTrace();
        }
        return config;
    }
}
Now your calling class
import org.apache.log4j.Logger;

public class MyCallingClass {

    private static final Logger logger = Logger.getLogger(MyCallingClass.class);

    public static void main(String[] args) {
        // Determine the active environment; you could also take this from an OS environment variable
        String activeEnvironment = MyPropertyUtil.getValuesFromPropertyFile(
                "resource/active_env.properties", "profiles.active");
        // Read from the environment-specific property file
        String myEnvSpecificValue = MyPropertyUtil.getValuesFromPropertyFile(
                "resource/" + activeEnvironment + "-it_works.properties", "my.property.string");
        // Do what you want to
        logger.info(myEnvSpecificValue);
    }
}
You can add an application-<environment>.properties file per environment. Spring Boot automatically detects the corresponding properties file based on the active profile.
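In a non-Boot application you can approximate that convention yourself, for instance by resolving the active profile inside the @PropertySource location. A sketch (assuming the profile name is passed as the system property spring.profiles.active and the matching file is on the classpath):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.PropertySource;
import org.springframework.context.support.PropertySourcesPlaceholderConfigurer;

@Configuration
@PropertySource("classpath:application-${spring.profiles.active}.properties")
public class EnvironmentSpecificConfig {

    // needed so ${...} placeholders in @Value annotations are resolved
    @Bean
    public static PropertySourcesPlaceholderConfigurer propertyConfigurer() {
        return new PropertySourcesPlaceholderConfigurer();
    }
}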

wildfly: reading properties from configuration directory

I'm trying to read deployment specific information from a properties file in my wildfly configuration folder. I tried this:
@Singleton
@Startup
public class DeploymentConfiguration {

    protected Properties props;

    @PostConstruct
    public void readConfig() {
        props = new Properties();
        try {
            props.load(getClass().getClassLoader().getResourceAsStream("my.properties"));
        } catch (IOException e) {
            // ... whatever
        }
    }
}
But apparently this is not working since the configuration folder is not in the classpath anymore. Now I can't find an easy way to do it. My favorite would be something like this:
@InjectProperties("my.properties")
protected Properties props;
The only solution I found on the web so far involves making my own OSGi module, but I believe there must be an easier way to do it (one without OSGi!). Can anyone show me how?
If you want to explicitly read a file from the configuration directory (e.g. $WILDFLY_HOME/standalone/configuration or domain/configuration) there's a system property with the path in it. Simply do System.getProperty("jboss.server.config.dir"); and append your file name to that to get the file.
You wouldn't read it as a resource though, so...
String fileName = System.getProperty("jboss.server.config.dir") + "/my.properties";
try (FileInputStream fis = new FileInputStream(fileName)) {
    properties.load(fis);
}
Then the file would be loaded for you.
Also, since WildFly doesn't ship with OSGi support anymore, I don't know how creating an OSGi module would help you here.
Here is a full example using just CDI, taken from this site.
Create and populate a properties file inside the WildFly configuration folder
$ echo 'docs.dir=/var/documents' >> ./standalone/configuration/application.properties
Add a system property to the WildFly configuration file.
$ ./bin/jboss-cli.sh --connect
[standalone#localhost:9990 /] /system-property=application.properties:add(value=${jboss.server.config.dir}/application.properties)
This will add the following to your server configuration file (standalone.xml or domain.xml):
<system-properties>
    <property name="application.properties" value="${jboss.server.config.dir}/application.properties"/>
</system-properties>
Create the singleton session bean that loads and stores the application wide properties
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

import javax.annotation.PostConstruct;
import javax.ejb.Singleton;

import org.apache.log4j.Logger;

@Singleton
public class PropertyFileResolver {

    private Logger logger = Logger.getLogger(PropertyFileResolver.class);
    private Map<String, String> properties = new HashMap<>();

    @PostConstruct
    private void init() throws IOException {
        // matches the property name as defined in the system-properties element in WildFly
        String propertyFile = System.getProperty("application.properties");
        File file = new File(propertyFile);
        Properties properties = new Properties();
        try {
            properties.load(new FileInputStream(file));
        } catch (IOException e) {
            logger.error("Unable to load properties file", e);
        }
        for (String name : properties.stringPropertyNames()) {
            this.properties.put(name, properties.getProperty(name));
        }
    }

    public String getProperty(String key) {
        return properties.get(key);
    }
}
Create the CDI Qualifier. We will use this annotation on the Java variables we wish to inject into.
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import javax.enterprise.util.Nonbinding;
import javax.inject.Qualifier;

@Qualifier
@Retention(RetentionPolicy.RUNTIME)
@Target({ ElementType.METHOD, ElementType.FIELD, ElementType.CONSTRUCTOR })
public @interface ApplicationProperty {

    // no default meaning a value is mandatory
    @Nonbinding
    String name();
}
Create the producer method; this generates the object to be injected
import javax.enterprise.inject.Produces;
import javax.enterprise.inject.spi.InjectionPoint;
import javax.inject.Inject;

public class ApplicationPropertyProducer {

    @Inject
    private PropertyFileResolver fileResolver;

    @Produces
    @ApplicationProperty(name = "")
    public String getPropertyAsString(InjectionPoint injectionPoint) {
        String propertyName = injectionPoint.getAnnotated().getAnnotation(ApplicationProperty.class).name();
        String value = fileResolver.getProperty(propertyName);
        if (value == null || propertyName.trim().length() == 0) {
            throw new IllegalArgumentException("No property found with name " + propertyName);
        }
        return value;
    }

    @Produces
    @ApplicationProperty(name = "")
    public Integer getPropertyAsInteger(InjectionPoint injectionPoint) {
        String value = getPropertyAsString(injectionPoint);
        return value == null ? null : Integer.valueOf(value);
    }
}
Lastly inject the property into one of your CDI beans
import javax.ejb.Stateless;
import javax.inject.Inject;

@Stateless
public class MySimpleEJB {

    @Inject
    @ApplicationProperty(name = "docs.dir")
    private String myProperty;

    public String getProperty() {
        return myProperty;
    }
}
The simplest thing you can do is to run standalone.sh with a -P option referencing your properties file (you need a URL file:/path/to/my.properties, or put the file in $WILDFLY_HOME/bin).
Then all properties from the file will be loaded as system properties.
For injecting configuration properties into your application classes, have a look at DeltaSpike Configuration, which supports different property sources like system properties, environment variables, JNDI entries and hides the specific source from your application.
Alternatively, to avoid setting system properties (which will be global in the sense of being visible to all applications deployed to your WildFly instance), you can also define a custom property source for DeltaSpike reading a properties file from any given location, and these properties will be local to your application.
It sounds like the problem you are trying to solve is managing different (but probably similar) configuration files for running your application in different environments (ie, Production, QA, or even different customers). If that is the case, take a look at Jfig http://jfig.sourceforge.net/ . It would obviate the need for storing property files outside your classpath (but you still could).
What is needed is a hierarchical approach to configuration files. The ninety percent of configuration values that do not change can be maintained in a base file. The other ten percent (or less) may be maintained in their own distinct configuration file. At run time, the files are layered on top of each other to provide a flexible, manageable configuration. For example, in a development environment myhost.config.xml combines with dev.config.xml and base.config.xml to form my unique configuration.
Each configuration file may then be maintained in version control as they have unique names. Only the base files need to be modified when base values change, and it is easy to see the difference between versions. Another major benefit is that changes to the base configuration file will be exhaustively tested before deployment.
File confDir = new File(System.getProperty("jboss.server.config.dir"));
File fileProp = new File(confDir, "my.properties");
// test fileProp.exists() etc.; you should throw or handle FileNotFoundException and IOException
try (InputStream in = new FileInputStream(fileProp)) {
    Properties properties = new Properties();
    properties.load(in);
}
To avoid this kind of problem, set jboss.server.config.dir in the VM arguments, like this:
-Djboss.server.config.dir="[jboss_repository]/server/[default-all-standard-standalone]/conf" –server
If you have a property in standalone.xml:
<property name="my.properties" value="propertyValue"/>
you can easily read it with:
static final String MY_PROPERTY = System.getProperty("my.properties");
Or if you specify context param in web.xml like:
<context-param>
<param-name>MyProperty</param-name>
<param-value>MyPropertyValue</param-value>
</context-param>
You can read it in a Java bean:
String myProperty= getServletContext().getInitParameter("MyProperty");

How do you implement a ManagedServiceFactory in OSGi?

I'm currently trying to set up my own implementation of a ManagedServiceFactory. Here is what I'm trying to do: I need multiple instances of some service on a per-configuration basis. With DS the components worked perfectly, but now I found out that these services should handle their own lifecycle (i.e. (de)registration at the service registry) depending on the availability of some external resource, which is impossible with DS.
Thus my idea was to create a ManagedServiceFactory, which would then receive configs from the ConfigurationAdmin and create instances of my class. These in turn would try to connect to the resource in a separate thread and register themselves as a service when they're ready to operate.
Since I've had no luck implementing this yet, I tried to break everything down to the most basic parts, not even dealing with the dynamic (de)registration, just trying to get the ManagedServiceFactory to work:
package my.project.factory;

import java.util.Dictionary;
import java.util.HashMap;
import java.util.Hashtable;
import java.util.Map;

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.framework.Constants;
import org.osgi.framework.ServiceRegistration;
import org.osgi.service.cm.ConfigurationException;
import org.osgi.service.cm.ManagedServiceFactory;

public class Factory implements BundleActivator, ManagedServiceFactory {

    private ServiceRegistration myReg;
    private BundleContext ctx;
    private Map<String, ServiceRegistration> services;

    @Override
    public void start(BundleContext context) throws Exception {
        System.out.println("starting factory...");
        this.ctx = context;
        Dictionary<String, Object> properties = new Hashtable<>();
        properties.put(Constants.SERVICE_PID, "my.project.servicefactory");
        myReg = context.registerService(ManagedServiceFactory.class, this, properties);
        System.out.println("registered as ManagedServiceFactory");
        services = new HashMap<String, ServiceRegistration>();
    }

    @Override
    public void stop(BundleContext context) throws Exception {
        for (ServiceRegistration reg : services.values()) {
            System.out.println("deregister " + reg);
            reg.unregister();
        }
        if (myReg != null) {
            myReg.unregister();
        } else {
            System.out.println("my service registration was already null " +
                    "(although it shouldn't be)!");
        }
    }

    @Override
    public String getName() {
        System.out.println("returning factory name");
        return "ServiceFactory";
    }

    @Override
    public void updated(String pid, Dictionary properties) throws ConfigurationException {
        System.out.println("retrieved update for pid " + pid);
        ServiceRegistration reg = services.get(pid);
        if (reg == null) {
            services.put(pid, ctx.registerService(ServiceInterface.class, new Service(), properties));
        } else {
            // I should do some update here
        }
    }

    @Override
    public void deleted(String pid) {
        ServiceRegistration reg = services.get(pid);
        if (reg != null) {
            reg.unregister();
        }
    }
}
Now, it should receive configurations from the ConfigurationAdmin for PID my.project.servicefactory, shouldn't it?
But it does not receive any configurations from the ConfigurationAdmin. The bundle is started, the service is registered, and in the web console I can see that the config admin holds a reference to my ManagedServiceFactory. Is there a certain property which should be set? The interface specification does not suggest that. Actually, my implementation is more or less the same as the example there. I have no idea what I'm doing wrong here; any pointers to the solution are very welcome.
Also, I originally thought to implement the ManagedServiceFactory itself as DS, which should also be possible, but I failed at the same point: no configurations are handed over by the ConfigAdmin.
update
To clarify the question: I think that this is mainly a configuration problem. As I see it, I should be able to specify two PIDs for the factory, one which identifies a configuration for the factory itself (if any), and one which would produce services through this factory, which I thought should be the factory.pid. But the framework constants do not hold anything like this.
update 2
After searching the Felix FileInstall source code a bit, I found out that it treats configuration files differently depending on whether or not there is a - in the filename. With the configuration file named my.project.servicefactory.cfg it did not work, but the configs named my.project.servicefactory-foo.cfg and my.project.servicefactory-bar.cfg were properly handed over to my ManagedServiceFactory as expected, and multiple services with ServiceInterface were registered. Hurray!
update 3
As proposed by Neil, I put the declarative service part in a new question to bound the scope of this one.
I think that the problem is you have a singleton configuration record rather than a factory record. You need to call Config Admin with the createFactoryConfiguration method using my.project.servicefactory as the factoryPid.
If you are using Apache Felix FileInstall (which is a nice easy way to create config records without writing code) then you need to create a file called my.project.servicefactory-1.cfg in the load directory. You can create further configurations with the same factoryPID by calling them my.project.servicefactory-2.cfg etc.
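For reference, creating such a factory configuration programmatically might look roughly like this (a sketch; how you obtain the ConfigurationAdmin reference and which properties you set are placeholders):

import java.util.Hashtable;

import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceReference;
import org.osgi.service.cm.Configuration;
import org.osgi.service.cm.ConfigurationAdmin;

public class FactoryConfigCreator {

    public void createConfig(BundleContext ctx) throws Exception {
        ServiceReference<ConfigurationAdmin> ref = ctx.getServiceReference(ConfigurationAdmin.class);
        ConfigurationAdmin configAdmin = ctx.getService(ref);

        // each call creates a new configuration record bound to the factory PID,
        // resulting in one updated(pid, properties) callback on the ManagedServiceFactory
        Configuration config = configAdmin.createFactoryConfiguration("my.project.servicefactory", null);
        Hashtable<String, Object> props = new Hashtable<>();
        props.put("some.key", "some value"); // placeholder property
        config.update(props);
    }
}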
