Add Converter to Log4j2 programmatically - java

I am using Log4j2. I have defined a custom appender and a pattern for it, and I set them up programmatically:
public static void initCustomLogging(List<Appender> appenders) {
    LoggerContext loggerContext = (LoggerContext) LogManager.getContext(false);
    Configuration configuration = loggerContext.getConfiguration();
    LoggerConfig rootLoggerConfig = configuration.getLoggerConfig("");
    for (Appender appender : appenders) {
        appender.start();
        rootLoggerConfig.addAppender(appender, Level.INFO, null);
    }
    loggerContext.updateLoggers();
}
I create my appender like this:
private static final String CUSTOM_PATTERN = "{\"time\":\"%d{HH:mm:ss.SSS}\",\"data\":%msg}";
private static final Layout CUSTOM_LAYOUT = PatternLayout.newBuilder()
        .withPattern(CUSTOM_PATTERN)
        .build();

Appender appender = CustomAppender.createAppender("custom", CUSTOM_LAYOUT);
However, I want to add custom arguments to my pattern, for example [%requestData], which would call some static method. I know I can define my converter like this:
@Plugin(name = "MyConverter", category = "Converter")
@ConverterKeys({"requestData"})
public class MyConverter extends LogEventPatternConverter {

    private MyConverter(final String name, final String style) {
        super(name, style);
    }

    public static MyConverter newInstance(final String[] options) {
        return new MyConverter("requestData", "requestData");
    }

    @Override
    public void format(final LogEvent event, final StringBuilder toAppendTo) {
        toAppendTo.append(getRequestData());
    }

    private String getRequestData() {
        String data = // some static method call
        if (data == null) {
            data = "-";
        }
        return data;
    }
}
If I were configuring Log4j2 via an XML config file, this would get picked up automatically and configured for me. However, since I am configuring it programmatically, how do I add this converter to my layout? I see no method for doing it.
Thanks for the help!

This should be equivalent to scanning the plugins under the specified package:
PluginManager.addPackage("yourpackage");
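For example, assuming the converter above lives in a package named com.example.logging (a placeholder, not from the question), the registration has to happen before the layout is built so that %requestData can be resolved when the pattern is parsed. A sketch:

```java
// Register the plugin package first; this must run before any pattern
// referencing %requestData is parsed.
PluginManager.addPackage("com.example.logging");

// Only then build the layout, so the pattern parser can resolve the converter.
PatternLayout layout = PatternLayout.newBuilder()
        .withPattern("{\"time\":\"%d{HH:mm:ss.SSS}\",\"data\":%msg,\"req\":\"%requestData\"}")
        .build();
```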

Persisting Path objects with spring-data-mongodb repositories

In a project I use spring-boot-starter-data-mongodb:2.5.3 and therefore spring-data-mongodb:3.2.3, and have an entity class that, simplified, looks like this:
@Document
public class Task {

    @Id
    private final String id;
    private final Path taskDir;
    ...
    // constructor, getters, setters
}
with a default Spring MongoDB repository that allows retrieving the task via its id.
The Mongo configuration looks as such:
@Configuration
@EnableMongoRepositories(basePackages = {
        "path.to.repository"
}, mongoTemplateRef = MongoConfig.MONGO_TEMPLATE_REF)
@EnableConfigurationProperties(MongoSettings.class)
public class MongoConfig extends MongoConfigurationSupport {

    private static final Logger LOG = LoggerFactory.getLogger(MethodHandles.lookup().lookupClass());

    public static final String MONGO_TEMPLATE_REF = "mongoAlTemplate";

    private final MongoSettings mongoSettings;

    @Autowired
    public MongoConfig(final MongoSettings mongoSettings) {
        this.mongoSettings = mongoSettings;
    }

    @Bean(name = "ourMongo", destroyMethod = "close")
    public MongoClient ourMongoClient() {
        MongoCredential credential =
                MongoCredential.createCredential(mongoSettings.getUser(),
                        mongoSettings.getDb(),
                        mongoSettings.getPassword());
        MongoClientSettings clientSettings = MongoClientSettings.builder()
                .readPreference(ReadPreference.primary())
                // enable optimistic locking for @Version and eTag usage
                .writeConcern(WriteConcern.ACKNOWLEDGED)
                .credential(credential)
                .applyToSocketSettings(
                        builder -> builder.connectTimeout(15, TimeUnit.SECONDS)
                                .readTimeout(1, TimeUnit.MINUTES))
                .applyToConnectionPoolSettings(
                        builder -> builder.maxConnectionIdleTime(10, TimeUnit.MINUTES)
                                .minSize(5).maxSize(20))
                // .applyToClusterSettings(
                //         builder -> builder.requiredClusterType(ClusterType.REPLICA_SET)
                //                 .hosts(Arrays.asList(new ServerAddress("host1", 27017),
                //                         new ServerAddress("host2", 27017)))
                //                 .build())
                .build();
        return MongoClients.create(clientSettings);
    }

    @Override
    @Nonnull
    protected String getDatabaseName() {
        return mongoSettings.getDb();
    }

    @Bean(name = MONGO_TEMPLATE_REF)
    public MongoTemplate ourMongoTemplate() throws Exception {
        return new MongoTemplate(ourMongoClient(), getDatabaseName());
    }
}
On attempting to save a task via taskRepository.save(task), the application ends up in a StackOverflowError:
java.lang.StackOverflowError
at java.lang.ThreadLocal.get(ThreadLocal.java:160)
at java.util.concurrent.locks.ReentrantReadWriteLock$Sync.tryReleaseShared(ReentrantReadWriteLock.java:423)
at java.util.concurrent.locks.AbstractQueuedSynchronizer.releaseShared(AbstractQueuedSynchronizer.java:1341)
at java.util.concurrent.locks.ReentrantReadWriteLock$ReadLock.unlock(ReentrantReadWriteLock.java:881)
at org.springframework.data.mapping.context.AbstractMappingContext.getPersistentEntity(AbstractMappingContext.java:239)
at org.springframework.data.mapping.context.AbstractMappingContext.getPersistentEntity(AbstractMappingContext.java:201)
at org.springframework.data.mapping.context.AbstractMappingContext.getPersistentEntity(AbstractMappingContext.java:87)
at org.springframework.data.mapping.context.MappingContext.getRequiredPersistentEntity(MappingContext.java:73)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writePropertyInternal(MappingMongoConverter.java:740)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writeProperties(MappingMongoConverter.java:657)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writeInternal(MappingMongoConverter.java:633)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writePropertyInternal(MappingMongoConverter.java:746)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writeProperties(MappingMongoConverter.java:657)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writeInternal(MappingMongoConverter.java:633)
...
On annotating the path object taskDir in the Task class with @Transient I'm able to persist the task, so the problem seems to be related to Java/Spring/MongoDB being unable to handle Path objects directly.
My next attempt was to configure a custom converter inside the MongoConfig class to convert between Path and String representations:
@Override
protected void configureConverters(
        MongoCustomConversions.MongoConverterConfigurationAdapter converterConfigurationAdapter) {
    LOG.info("configuring converters");
    converterConfigurationAdapter.registerConverter(new Converter<Path, String>() {
        @Override
        public String convert(@Nonnull Path path) {
            return path.normalize().toAbsolutePath().toString();
        }
    });
    converterConfigurationAdapter.registerConverter(new Converter<String, Path>() {
        @Override
        public Path convert(@Nonnull String path) {
            return Paths.get(path);
        }
    });
}
though the error remained. I then defined a direct conversion between the Task object and a DBObject, as showcased in this guide:
@Override
protected void configureConverters(
        MongoCustomConversions.MongoConverterConfigurationAdapter converterConfigurationAdapter) {
    LOG.info("configuring converters");
    converterConfigurationAdapter.registerConverter(new Converter<Task, DBObject>() {
        @Override
        public DBObject convert(@Nonnull Task source) {
            DBObject dbObject = new BasicDBObject();
            if (source.getTaskDirectory() != null) {
                dbObject.put("taskDir", source.getTaskDirectory().normalize().toAbsolutePath().toString());
            }
            ...
            return dbObject;
        }
    });
}
and I still get a StackOverflowError in return. Through the log statement I added I see that Spring called into the configureConverters method and therefore should have registered the custom converters.
Why do I still get the StackOverflowError? How do I tell Spring to treat Path objects as Strings while persisting, and on reads convert the String value back into a Path object?
Update:
I've now followed the example given in the official documentation and refactored the converter into its own class:
import org.bson.Document;
import org.springframework.core.convert.converter.Converter;
import org.springframework.data.convert.WritingConverter;
import javax.annotation.Nonnull;

@WritingConverter
public class TaskWriteConverter implements Converter<Task, Document> {

    @Override
    public Document convert(@Nonnull Task source) {
        Document document = new Document();
        document.put("_id", source.getId());
        if (source.getTaskDir() != null) {
            document.put("taskDir", source.getTaskDir().normalize().toAbsolutePath().toString());
        }
        return document;
    }
}
The configuration in the MongoConfig class now looks like this:
@Override
protected void configureConverters(
        MongoCustomConversions.MongoConverterConfigurationAdapter adapter) {
    LOG.info("configuring converters");
    adapter.registerConverter(new TaskWriteConverter());
    adapter.registerConverter(new TaskReadConverter());
    adapter.registerConverter(new Converter<Path, String>() {
        @Override
        public String convert(@Nonnull Path path) {
            return path.normalize().toAbsolutePath().toString();
        }
    });
    adapter.registerConverter(new Converter<String, Path>() {
        @Override
        public Path convert(@Nonnull String path) {
            return Paths.get(path);
        }
    });
}
After changing the logging level for org.springframework.data to debug I see in the logs that these converters also got picked up:
2021-09-23 14:09:20.469 [INFO ] [ main] MongoConfig configuring converters
2021-09-23 14:09:20.480 [DEBUG] [ main] CustomConversions Adding user defined converter from class com.acme.Task to class org.bson.Document as writing converter.
2021-09-23 14:09:20.480 [DEBUG] [ main] CustomConversions Adding user defined converter from class org.bson.Document to class com.acme.Task as reading converter.
2021-09-23 14:09:20.481 [DEBUG] [ main] CustomConversions Adding user defined converter from interface java.nio.file.Path to class java.lang.String as writing converter.
2021-09-23 14:09:20.481 [DEBUG] [ main] CustomConversions Adding user defined converter from class java.lang.String to interface java.nio.file.Path as reading converter.
However, I see that most of the converters are added multiple times, i.e. I find the log Adding converter from class java.lang.Character to class java.lang.String as writing converter. four times before the application hits the save method on the repository. As my custom converters are only added the third time these converters appear in the logs, and the logs of the last "iteration" don't include my custom converters, I have a feeling that they are somehow overwritten.
The test case that reproduces that issue is as follows:
@SpringBootTest
@AutoConfigureMockMvc
@PropertySource("classpath:application-test.properties")
public class SomeIT {

    @Autowired
    private TaskRepository taskRepository;
    ...

    @Test
    public void testTaskPersistence() throws Exception {
        Task task = new Task("1234", Paths.get("/home/roman"));
        taskRepository.save(task);
    }
    ...
}
The test method is only used to investigate the current persistence issue and under normal conditions wouldn't be there at all, as the integration test covers the upload of a large file, its preprocessing, and so on. This integration test, however, fails because Spring is seemingly unable to store entities that contain Path objects.
Note that for simple entities I have no issues persisting them with the outlined setup, and I also see them in the dockerized MongoDB.
I haven't had time yet to dig deeper into why Spring has such problems with Path objects or why my custom converters suddenly disappear in the last iteration of the CustomConversions log output.
It turns out that the way the mongoTemplate was configured "overwrites" any specified custom converters, and thus Spring was not able to make use of them to convert Path to String and vice versa.
After changing the MongoConfig to the one below, I'm finally able to use my custom converters and thus persist entities as expected:
@Configuration
@EnableMongoRepositories(basePackages = {
        "path.to.repository"
}, mongoTemplateRef = MongoConfig.MONGO_TEMPLATE_REF)
@EnableConfigurationProperties(MongoSettings.class)
public class MongoConfig extends MongoConfigurationSupport {

    private static final Logger LOG = LoggerFactory.getLogger(MethodHandles.lookup().lookupClass());

    public static final String MONGO_TEMPLATE_REF = "mongoAlTemplate";

    private final MongoSettings mongoSettings;

    @Autowired
    public MongoConfig(final MongoSettings mongoSettings) {
        this.mongoSettings = mongoSettings;
    }

    @Bean(name = "ourMongo", destroyMethod = "close")
    public MongoClient ourMongoClient() {
        MongoCredential credential =
                MongoCredential.createCredential(mongoSettings.getUser(),
                        mongoSettings.getDb(),
                        mongoSettings.getPassword());
        MongoClientSettings clientSettings = MongoClientSettings.builder()
                .readPreference(ReadPreference.primary())
                // enable optimistic locking for @Version and eTag usage
                .writeConcern(WriteConcern.ACKNOWLEDGED)
                .credential(credential)
                .applyToSocketSettings(
                        builder -> builder.connectTimeout(15, TimeUnit.SECONDS)
                                .readTimeout(1, TimeUnit.MINUTES))
                .applyToConnectionPoolSettings(
                        builder -> builder.maxConnectionIdleTime(10, TimeUnit.MINUTES)
                                .minSize(5).maxSize(20))
                // .applyToClusterSettings(
                //         builder -> builder.requiredClusterType(ClusterType.REPLICA_SET)
                //                 .hosts(Arrays.asList(new ServerAddress("host1", 27017),
                //                         new ServerAddress("host2", 27017)))
                //                 .build())
                .build();
        LOG.info("Mongo client initialized. Connecting with user {} to DB {}",
                mongoSettings.getUser(), mongoSettings.getDb());
        return MongoClients.create(clientSettings);
    }

    @Override
    @Nonnull
    protected String getDatabaseName() {
        return mongoSettings.getDb();
    }

    @Bean
    public MongoDatabaseFactory ourMongoDBFactory() {
        return new SimpleMongoClientDatabaseFactory(ourMongoClient(), getDatabaseName());
    }

    @Bean(name = MONGO_TEMPLATE_REF)
    public MongoTemplate ourMongoTemplate() throws Exception {
        return new MongoTemplate(ourMongoDBFactory(), mappingMongoConverter());
    }

    @Bean
    public MappingMongoConverter mappingMongoConverter() throws Exception {
        DbRefResolver dbRefResolver = new DefaultDbRefResolver(ourMongoDBFactory());
        MongoCustomConversions customConversions = customConversions();
        MongoMappingContext context = mongoMappingContext(customConversions);
        MappingMongoConverter converter = new MappingMongoConverter(dbRefResolver, context);
        // this one is actually needed, otherwise the StackOverflowError re-appears!
        converter.setCustomConversions(customConversions);
        return converter;
    }

    @Bean
    @Override
    @Nonnull
    public MongoCustomConversions customConversions() {
        return new MongoCustomConversions(
                Arrays.asList(new PathWriteConverter(), new PathReadConverter())
        );
    }
}
So, instead of passing the MongoClient and the database name to the mongoTemplate directly, a MongoDatabaseFactory object holding the above-mentioned values and a MappingMongoConverter object are passed as input to the template.
Unfortunately, it is necessary to pass the customConversion object twice within the mappingMongoConverter() method. If not done so, the StackOverflowError reappears.
With the given configuration, conversions from Path to String and String to Path are now possible and thus no custom conversions from Task to Document and vice versa are currently needed.

How to set default values of model class variables from yaml file?

In a service class I would simply use @Value and initialize the variable there. I have tried this approach in a model class, but (I assume because of how things get autowired, and that it's a model class) this results in the value always being null.
The need for this arises because the default value differs between environments.
@Value("${type}")
private String type;
I would avoid trying to use Spring logic inside the models as they are not Spring beans themselves. Maybe use some form of a creational (pattern) bean in which the models are constructed, for example:
@Component
public class ModelFactory {

    @Value("${some.value}")
    private String someValue;

    public SomeModel createNewInstance(Class<SomeModel> clazz) {
        return new SomeModel(someValue);
    }
}

public class SomeModel {

    private String someValue;

    public SomeModel(String someValue) {
        this.someValue = someValue;
    }

    public String getSomeValue() {
        return someValue;
    }
}

@ExtendWith({SpringExtension.class})
@TestPropertySource(properties = "some.value=" + ModelFactoryTest.TEST_VALUE)
@Import(ModelFactory.class)
class ModelFactoryTest {

    protected static final String TEST_VALUE = "testValue";

    @Autowired
    private ModelFactory modelFactory;

    @Test
    public void test() {
        SomeModel someModel = modelFactory.createNewInstance(SomeModel.class);
        Assertions.assertEquals(TEST_VALUE, someModel.getSomeValue());
    }
}

How to read OSGI configuration via plain Java class

I need to get some OSGi configuration values via a plain Java class which is not registered as a service, so I cannot use the @Reference or @Inject annotations. I have used the BundleContext to get the config, but it is not working.
public void getArticleName() {
    final BundleContext bundleContext = FrameworkUtil.getBundle(ArticleNameService.class).getBundleContext();
    try {
        String articleName = (String) bundleContext.getService((bundleContext.getServiceReferences(ArticleNameService.class.getName(), " article.name "))[0]);
        LOG.info("articleName......" + articleName);
    } catch (InvalidSyntaxException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
}
Service class:
@Service(ArticleNameService.class)
@Component(metatype = true)
@Properties({
        @Property(
                name = "article.name", unbounded = PropertyUnbounded.ARRAY, cardinality = Integer.MAX_VALUE,
                label = "article addrnameess"),
})
public class ArticleNameServiceImpl implements ArticleNameService {

    private static final String ARTICLE_NAME = "article.name";

    private String[] articleName;

    protected final void activate(final ComponentContext componentContext) {
        final Dictionary<String, Object> configurationProperties = componentContext.getProperties();
        if (configurationProperties != null) {
            articleName = PropertiesUtil.toStringArray(configurationProperties.get(ARTICLE_NAME));
        }
    }

    @Override
    public final String[] getArticeName() {
        return articleName;
    }
}
Is this the correct way of doing it? If not, what is the correct way to get the configuration?
You can get any configuration using ConfigurationAdmin. For your DS components the pid by default is the fully qualified name of your component class.
Bundle bundle = FrameworkUtil.getBundle(this.getClass());
BundleContext context = bundle.getBundleContext();
ServiceReference<ConfigurationAdmin> reference = context.getServiceReference(ConfigurationAdmin.class);
ConfigurationAdmin configAdmin = context.getService(reference);
Configuration conf = configAdmin.getConfiguration("yourpid");
String articleName = (String)conf.getProperties().get("article.name");
context.ungetService(reference);

Spring Boot Configuration Properties not being set

I am trying to set up the ClamAV virus scanner in a Spring Boot environment. I want to set the host and port in a properties file, clamav.properties, located in my resources directory along with the application.properties file. It looks like this:
clamav.host=localhost
clamav.port=3310
clamav.timeout=1000
I have this class:
@ConfigurationProperties("clamav.properties")
public class ClamAvClient {

    static final Logger logger = LoggerFactory.getLogger(ClamAvClient.class);

    @Value("${clamav.host}")
    private String clamHost;
    @Value("${clamav.port}")
    private int clamPort;
    @Value("${clamav.timeout}")
    private int clamTimeout;

    public boolean ping() throws IOException {
        logger.debug("Host:" + clamHost + " Port:" + clamPort);
        blah.....
    }

    private static byte[] asBytes(String s) {
        return s.getBytes(StandardCharsets.US_ASCII);
    }

    public String getClamHost() {
        return clamHost;
    }

    public void setClamHost(String clamHost) {
        this.clamHost = clamHost;
    }

    public int getClamPort() {
        return clamPort;
    }

    public void setClamPort(int clamPort) {
        this.clamPort = clamPort;
    }

    public int getClamTimeout() {
        return clamTimeout;
    }

    public void setClamTimeout(int clamTimeout) {
        this.clamTimeout = clamTimeout;
    }
}
It's not connecting and in the logs I get this:
2017-09-23 20:39:45.947 DEBUG 28857 --- [http-nio-8080-exec-2] xxx.ClamAvClient : Host:null Port:0
So those values are clearly not getting set. What am I doing wrong? I am using the managed version of spring-boot-starter-web, which my Eclipse says is 1.4.3-RELEASE.
Any ideas?
Use @ConfigurationProperties to map a group of properties to a class via the configuration processor. Using @Value inside a @ConfigurationProperties class doesn't look right.
All you need to map your properties to the class is:
@Configuration
@ConfigurationProperties(prefix = "clamav")
public class ClamAvClient {

    static final Logger logger = LoggerFactory.getLogger(ClamAvClient.class);

    private String host;
    private int port;
    private int timeout;

    // getters and setters
}
prefix = "clamav" matches the prefix in your properties file, and the field names host, port, and timeout match the property keys. Note also that Spring Boot does not load a clamav.properties file by default; either move these entries into application.properties or add @PropertySource("classpath:clamav.properties") to a configuration class.

Error injecting constructor in play java mongodb

I'm trying to inject my DAO object into a controller. I have the following classes:
1. MongoDBHelper
2. MerchantDAO
3. MerchantService
4. MerchantController
This is the MongoDBHelper class:
import javax.inject.Singleton;

@Singleton
public class MongoDBHelper {

    private DB db;
    private Datastore datastore;

    private Configuration config = Play.application().configuration();

    private final String SERVER_URL = config.getString("server_url");
    private final String USERNAME = config.getString("database.userName");
    private final String PASSWORD = config.getString("database.password");
    private final String DATABASE_NAME = config.getString("database.name");

    public MongoDBHelper() {
        try {
            MongoClient mongoClient = new MongoClient();
            this.db = mongoClient.getDB(DATABASE_NAME);
            this.db.authenticate(USERNAME, PASSWORD.toCharArray());
            Morphia morphia = new Morphia();
            this.datastore = morphia.createDatastore(mongoClient, DATABASE_NAME);
            morphia.mapPackage("models");
        } catch (UnknownHostException e) {
            e.printStackTrace();
        }
    }

    public DB getDB() {
        return this.db;
    }

    public Datastore getDatastore() {
        return this.datastore;
    }
}
This is the MerchantDAO class:
public class MerchantDAO {

    @Inject MongoDBHelper mongoDBHelper;

    private Datastore datastore = mongoDBHelper.getDatastore();
    private DB db = mongoDBHelper.getDB();

    private static final String AUTH_TOKEN = "authToken";
    private static final Config config = ConfigFactory.load(Play.application().configuration().getString("property.file.name"));

    public void updateMerchantWithAuthToken(Merchant merchant) {
        Query<Merchant> query = datastore.createQuery(Merchant.class).field(config.getString("string.email")).equal(merchant.getEmail());
        UpdateOperations<Merchant> ops = datastore.createUpdateOperations(Merchant.class).set(AUTH_TOKEN, merchant.getAuthToken()).set("lastRequestTime", merchant.getLastRequestTime());
        UpdateResults res = datastore.update(query, ops);
    }
}
This is the MerchantService class:
public class MerchantService {

    static final Config config = ConfigFactory.load(Play.application().configuration().getString("property.file.name"));

    @Inject
    MerchantDAO merchantDAO;

    // Creating unique authToken for already logged in merchant
    public String createToken(Merchant merchant) {
        merchantDAO.updateMerchantWithAuthToken(merchant);
        return authToken;
    }
}
This is the MerchantController:
import javax.inject.Inject;

public class MerchantController extends Controller {

    @Inject MerchantService merchantService;

    public final static String AUTH_TOKEN_HEADER = "X-AUTH-TOKEN";
    public static final String AUTH_TOKEN = "authToken";
    public static final Config config = ConfigFactory.load(Play.application().configuration().getString("property.file.name"));

    public static Merchant getMerchant() {
        return (Merchant) Http.Context.current().args.get("merchant");
    }

    public Result login() throws Exception {
        // code to perform login
        return ok(); // status success / failure
    }
}
I'm getting the following error:
ProvisionException: Unable to provision, see the following errors:
1) Error injecting constructor, java.lang.NullPointerException
at daos.MerchantDAO.<init>(MerchantDAO.java:22)
while locating daos.MerchantDAO
for field at services.MerchantService.merchantDAO(MerchantService.java:26)
while locating services.MerchantService
for field at controllers.MerchantController.merchantService(MerchantController.java:21)
while locating controllers.MerchantController
for parameter 2 at router.Routes.<init>(Routes.scala:36)
while locating router.Routes
while locating play.api.inject.RoutesProvider
while locating play.api.routing.Router
1 error
What am I possibly doing wrong? Why is DI not working properly?
Thanks in advance.
I think the problem is with these lines:
private Datastore datastore = mongoDBHelper.getDatastore();
private DB db = mongoDBHelper.getDB();
These are evaluated during the object instance's construction. I believe that injection won't occur until AFTER the object instance has completed construction. Therefore, mongoDBHelper is null while the above assignments are made.
One way to solve this would be to set datastore and db in the method updateMerchantWithAuthToken.
The problem is that you are trying to access the Configuration object during the MongoDBHelper instantiation. You should instead inject Play's Configuration object into your class's constructor and initialize all properties within the constructor:
@Inject
public MongoDBHelper(Configuration configuration) {
    this.config = configuration;
    // <read the rest of the config values here>
}
See the note in the configurable bindings section of the D.I. documentation here
