LoggerProducer.java is a class used to produce Loggers to be injected in CDI beans with:
@Inject
Logger LOG;
Full code:
import java.util.HashMap;
import java.util.Map;
import java.util.logging.Logger;

import javax.ejb.Singleton;
import javax.enterprise.inject.Produces;
import javax.enterprise.inject.spi.InjectionPoint;

/**
 * @author rveldpau
 */
@Singleton
public class LoggerProducer {

    private Map<String, Logger> loggers = new HashMap<>();

    @Produces
    public Logger getProducer(InjectionPoint ip) {
        String key = getKeyFromIp(ip);
        if (!loggers.containsKey(key)) {
            loggers.put(key, Logger.getLogger(key));
        }
        return loggers.get(key);
    }

    private String getKeyFromIp(InjectionPoint ip) {
        return ip.getMember().getDeclaringClass().getCanonicalName();
    }
}
QUESTION: can @Singleton be safely turned into @ApplicationScoped?
I mean, why would anyone want an EJB here? Are there technical reasons, given that no transactions are involved and (AFAIK) it would be thread-safe anyway?
I'm obviously referring to javax.enterprise.context.ApplicationScoped, not to javax.faces.bean.ApplicationScoped.
The EJB @Singleton annotation provides not only transactions but also thread-safety by default. So if you replace it with @ApplicationScoped, you will lose that synchronization. To do it properly, you need something like this:
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.logging.Logger;

import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.inject.Produces;
import javax.enterprise.inject.spi.InjectionPoint;

@ApplicationScoped
public class LoggerProducer {

    private final ConcurrentMap<String, Logger> loggers = new ConcurrentHashMap<>();

    @Produces
    public Logger getProducer(InjectionPoint ip) {
        String key = getKeyFromIp(ip);
        loggers.putIfAbsent(key, Logger.getLogger(key));
        return loggers.get(key);
    }

    private String getKeyFromIp(InjectionPoint ip) {
        return ip.getMember().getDeclaringClass().getCanonicalName();
    }
}
You can also make it work without any scope at all if you make the map static, as sketched below.
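For example, a minimal sketch of that scope-less variant, assuming the same java.util.logging.Logger and CDI producer setup as above:

import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.logging.Logger;

import javax.enterprise.inject.Produces;
import javax.enterprise.inject.spi.InjectionPoint;

// No scope annotation: the producer bean itself is @Dependent, but the static map is
// shared across all instances, so each Logger is still created once per class name
// and access remains thread-safe.
public class LoggerProducer {

    private static final ConcurrentMap<String, Logger> LOGGERS = new ConcurrentHashMap<>();

    @Produces
    public Logger getProducer(InjectionPoint ip) {
        String key = ip.getMember().getDeclaringClass().getCanonicalName();
        return LOGGERS.computeIfAbsent(key, Logger::getLogger);
    }
}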
I'm interested in using HK2 or Guice as a dependency injection framework.
I know of @Named, @Qualifier, custom annotations, etc. But these are all compile-time.
I am looking for a facility to dynamically determine the desired concrete type based on runtime context and inject the correct implementation.
Is there something like that in HK2 or Guice, or a recommended way of achieving this?
For example:
// I would want to turn this...
public final class Handler
{
    private final Session session;

    @Inject
    public Handler(@Named("Database") final Session session)
    {
        this.session = session;
    }
    ...
}

// into something like this...
public final class Handler
{
    private final Session session;

    @Inject
    public Handler(final Session session)
    {
        this.session = session;
    }
}

// where "session" is injected based on some previous context value ("Database")
// or something to that effect.
I ended up using a feature in HK2 called Operations (link to docs). It allows a user of HK2 to define custom scopes and manage them as "operations". You can find a more detailed example of how to use the feature in HK2's GitHub project: operations example.
This is a simplified example of how I ended up using this feature to inject things based on context, or in this case, "scope".
Here is some almost-working pseudo-code to demonstrate my usage:
// Create the custom scope annotation.
@Scope
@Proxiable(proxyForSameScope = false)
@Documented
@Target({ ElementType.TYPE, ElementType.METHOD })
@Retention(RetentionPolicy.RUNTIME)
public @interface BatchScope
{
    public static final BatchScope INSTANCE = new BatchScopeEnvoy();
}

final class BatchScopeEnvoy extends AnnotationLiteral<BatchScope> implements BatchScope
{
    private static final long serialVersionUID = 938233179310254573L;
}
// Create a context used by the HK2 operation feature.
@Singleton
public final class BatchScopeContext extends OperationContext<BatchScope>
{
    @Override
    public Class<? extends Annotation> getScope()
    {
        return BatchScope.class;
    }
}

// Create a class that holds your custom scope data/context.
public final class BatchScopeRuntime
{
    // ... Arbitrary runtime data here ...

    public SomeData getData()
    {
        return this.data;
    }
}
// Create a factory that serves up something you want to inject from a custom scope.
@Singleton
public final class DataFactory implements Factory<SomeData>
{
    private final OperationManager operations;

    @Inject
    public DataFactory(final OperationManager operations)
    {
        Sentinel.assertIsNotNull(operations);
        this.operations = operations;
    }

    // The @BatchScope on the provide() method indicates that objects returned
    // from this factory are in the "BatchScope".
    @Override
    @BatchScope
    public SomeData provide()
    {
        final OperationHandle handle = this.operations.getCurrentOperation(BatchScope.INSTANCE);
        final BatchScopeRuntime runtime = (BatchScopeRuntime) handle.getOperationData();
        return runtime.getData();
    }

    @Override
    public void dispose(final SomeData instance)
    {
        // Do nothing.
    }
}
// Setup the injector.
public static ServiceLocator createInjector(final String name)
{
    final ServiceLocator injector = ServiceLocatorFactory.getInstance().create(name);

    ServiceLocatorUtilities.bind(
        injector,
        new AbstractBinder()
        {
            @Override
            protected void configure()
            {
                // This creates a "Singleton" factory that provides
                // "SomeData" instances at "BatchScope".
                bindFactory(DataFactory.class, Singleton.class)
                    .to(SomeData.class)
                    .in(BatchScope.class);
            }
        });

    return injector;
}
// Create a class that needs something in the custom scope.
public final class Foo
{
    @Inject
    public Foo(final SomeData data)
    {
        System.out.printf("I got: %s%n", data);
    }
}

// Usage: how to manage the scopes using the operations feature.
final SomeData data = ... // get some data
final BatchScopeRuntime runtime = new BatchScopeRuntime(data); // Setup the runtime information.

// Create an operation handle for the custom scope and associate the custom data with it.
final ServiceLocator injector = createInjector("test");
ServiceLocatorUtilities.addClasses(injector, BatchScopeContext.class, Foo.class);
final OperationManager operations = injector.getService(OperationManager.class);
final OperationHandle<BatchScope> batchScope = operations.createAndStartOperation(BatchScope.INSTANCE);

// Operation/scope is now associated with the current thread.
batchScope.setOperationData(runtime);

// Foo will now be injected with: "data" from above.
final Foo foo = injector.getService(Foo.class);

// Do some work...

// Close the operation (make it go out of scope) on the current thread.
batchScope.closeOperation();
Is it possible to Autowire fields in a dynamic class?
I am getting a class name from the database and I want to autowire this class.
Short Answer
That's not possible. Spring needs to know which beans exist in order to inject them.
Long Answer
You could @Autowire every possible bean into a class and then cache them in a Map, with the Class as the key and the Object as the value. See the simplified example below:
import java.util.HashMap;
import java.util.Map;

import org.springframework.beans.factory.annotation.Autowired;

public class MyClass {

    private final Map<Class<?>, Object> cache = new HashMap<>();

    @Autowired
    public MyClass(Service1 s1, Service2 s2) {
        // registering the beans
        cache.put(Service1.class, s1);
        cache.put(Service2.class, s2);
    }

    @SuppressWarnings("unchecked")
    public <T> T getService(String className) throws ClassNotFoundException {
        // getting the bean
        Class<?> clazz = Class.forName(className);
        return (T) cache.get(clazz);
    }
}
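A hypothetical usage, assuming MyClass is itself a Spring-managed bean and the database returned the fully qualified name of Service1:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

@Component
public class DynamicCaller {

    @Autowired
    private MyClass registry;

    public Object callDynamic(String classNameFromDb) throws ClassNotFoundException {
        // classNameFromDb might be "com.example.Service1" (hypothetical value from the database).
        return registry.getService(classNameFromDb);
    }
}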
Not sure it's a good idea, but you can inject a class as mentioned here:
Injecting beans into a class outside the Spring managed context
You can try this:
import javax.annotation.PostConstruct;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationContext;
import org.springframework.stereotype.Component;

@Component
public class ApplicationContextAccessor {

    private static ApplicationContextAccessor instance;

    @Autowired
    private ApplicationContext applicationContext;

    public static <T> T getBean(Class<T> clazz) {
        return instance.applicationContext.getBean(clazz);
    }

    @PostConstruct
    private void registerInstance() {
        instance = this;
    }
}
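For example, once the context has started, any other code could then look up a bean reflectively (the class name used here is hypothetical):

public class BeanLookupExample {

    // Resolves the class named in the database and fetches the matching Spring bean.
    public static Object lookup(String classNameFromDb) throws ClassNotFoundException {
        Class<?> clazz = Class.forName(classNameFromDb); // e.g. "com.example.SomeService" (hypothetical)
        return ApplicationContextAccessor.getBean(clazz);
    }
}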
Read this post: https://www.helicaltech.com/uses-of-springs-applicationcontext-while-using-reflection/
I have a Spring Boot Java application which streams data continuously from Kafka and saves it to a Cassandra database after applying business logic.
Below are pseudo classes and functions which closely resemble my application.
KafkaStreamer
@Configuration
@EnableKafka
public class KafkaStreamer {

    private static final Logger LOGGER = LoggerFactory.getLogger(KafkaStreamer.class);

    @Autowired
    private MyController myController;

    @KafkaListener(topics = "${my-topic}", group = "${my-group}")
    public void streamFromKafka(String payload) {
        myController.processPayload(payload);
        LOGGER.info("I am continuously streaming data from Kafka "
                + "and forwarding it to the controller for further processing...!");
    }
}
MyController
@Controller
public class MyController {

    private static final Logger LOGGER = LoggerFactory.getLogger(MyController.class);

    @Autowired
    private MyService myService;

    public void processPayload(String payload) {
        myService.applyBusinessLogic(payload);
        LOGGER.info("Sent to service for business-logic processing");
    }
}
MyService
@Service
public class MyService {

    private static final Logger LOGGER = LoggerFactory.getLogger(MyService.class);

    @Autowired
    private MyDomain myDomain;

    public void applyBusinessLogic(String payload) {
        myDomain.saveToDatabase(payload);
        LOGGER.info("Applied business logic");
    }
}
MyDomain
@Repository
public class MyDomain {

    private static final Logger LOGGER = LoggerFactory.getLogger(MyDomain.class);

    @Autowired
    private CassandraOperations cassandraTemplate;

    /** The session. */
    private Session session = null;

    public void saveToDatabase(String payload) {
        saveToTableA(payload);
        saveToTableB(payload);
        // Hello, I have saved data to database
        LOGGER.info("Saved data to database");
    }

    private void saveToTableB(String payload) {
        if (session == null)
            session = cassandraTemplate.getSession();
        session.execute(payload);
    }

    private void saveToTableA(String payload) {
        if (session == null)
            session = cassandraTemplate.getSession();
        session.execute(payload);
    }
}
The above pseudo code closely resembles my original application.
As you can see, I do not have any class-level variables other than the logger, some autowired variables, and the Cassandra session in the MyDomain class.
As far as I know, autowired beans in Spring Boot are singletons by default.
I am passing the payload (my message from Kafka) from one class to another as a method argument rather than setting it as a class-level property of the other class.
My question is:
Is my above application architecture or code thread-safe?
Can autowiring create a problem, since by default it gives a singleton reference of a class? (A point to note here is that I do not have any class-level variables other than the logger and autowired variables.)
If you feel there is a better way to do anything here, please feel free to write.
Many thanks.
Your solution stops being thread-safe as soon as you start mutating the state of a shared object.
For example, if you had a payload property in the MyDomain singleton Spring bean, set its value in saveToDatabase, and a bit later (even in the same method) read it back, that would be mutation and your code would no longer be thread-safe.
For example:
@Repository
public class MyDomain {

    private String payload;

    public void saveToDatabase(String payload) {
        this.payload = payload;
        saveToTableA();
        saveToTableB();
    }

    private void saveToTableA() {
        tableA.save(this.payload);
    }

    // saveToTableB() analogous...
}
In this case, when several threads call saveToDatabase concurrently with different values, there is no guarantee which value you will actually save to the DB in saveToTableA().
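A minimal sketch of the thread-safe alternative, which is essentially what the question's code already does: keep the bean stateless and pass the payload down through method parameters, so each thread works only with its own local values. TableADao and TableBDao are hypothetical collaborators standing in for the Cassandra access:

@Repository
public class MyDomain {

    // Hypothetical, injected collaborators; no mutable instance state is kept here.
    private final TableADao tableA;
    private final TableBDao tableB;

    public MyDomain(TableADao tableA, TableBDao tableB) {
        this.tableA = tableA;
        this.tableB = tableB;
    }

    // The payload lives only on each calling thread's stack, so concurrent calls cannot interfere.
    public void saveToDatabase(String payload) {
        saveToTableA(payload);
        saveToTableB(payload);
    }

    private void saveToTableA(String payload) {
        tableA.save(payload);
    }

    private void saveToTableB(String payload) {
        tableB.save(payload);
    }
}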
I have a Controller class which is invoked first in my application. There I was planning to retrieve a value from a Map in a Service class.
Here's the controller:
@Controller
public class AppController {

    public Service doSomethingWithTheMap(String key) {
        return ServiceImpl.getMapValueFor(key).exec();
    }
}
I run into issues because during initialization, or more precisely while putting values into the Service's Map, I need the BeanFactory, because the values in the Map are Service implementations.
Doing it in a static block causes the BeanFactory to be null because, I would guess, it has not been injected yet.
So ending up with this initMap() call makes me feel a bit like... there should be a better solution.
Any hints, somebody?
I have to admit that I am new to Spring and maybe I am messing things up here. FYI, the Map came to mind after having endless if/else checks deciding which Service to call based on a String input. I therefore replaced them with the Map and a simple one-liner in the Controller:
ServiceImpl.getMapValueFor(key).exec();
Here's the Service class:
@Service
public class ServiceImpl {

    private static Map<String, Service> map;
    private static ApplicationContext context;

    @Autowired
    public void setApplicationContext(ApplicationContext factory) {
        this.context = factory;
    }

    public static Service getMapValueFor(String key) {
        if (map == null) {
            initMap();
        }
        return map.get(key);
    }

    private static void initMap() {
        /*
         * FIXME: We can not init the map in a static block or directly
         * initialize it, since the factory is not yet injected when a
         * static block runs and would still be null there.
         */
        BeanFactory factory = context;
        map = new HashMap<String, Service>();
        map.put("key", factory.getBean(SomeService.class));
    }
}
The first thing I want to say is that you have a bug, because you are using a HashMap with no synchronization. Don't be alarmed; many (if not most) Java developers would make the same mistake.
Unless you have oversimplified the code, your service looks more like a command than a service; a service is a singleton. It is not impossible for services to have methods without arguments, but I would say it is uncommon. Are you sure you should not be using prototype beans instead of singletons?
Typically the number of services is finite, and if you have multiple services of the same type you would use @Qualifier when autowiring them. In any case this code looks dodgy to me, so perhaps you should try to explain the problem at a higher level, because there may be a better way than your current code path.
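For illustration, a minimal sketch of the @Qualifier approach just mentioned, reusing the question's Service interface; the bean names fooService and barService and the selection logic are hypothetical:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.stereotype.Controller;

@Controller
public class AppController {

    private final Service fooService;
    private final Service barService;

    // Each constructor parameter is wired to one specific implementation via its qualifier.
    @Autowired
    public AppController(@Qualifier("fooService") Service fooService,
                         @Qualifier("barService") Service barService) {
        this.fooService = fooService;
        this.barService = barService;
    }

    public Service doSomethingWithTheMap(String key) {
        // Choose the implementation explicitly instead of via a static map.
        return "foo".equals(key) ? fooService : barService;
    }
}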
Your service class ServiceImpl must implement the interface org.springframework.context.ApplicationContextAware to get the instance of Spring's application context.
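A minimal sketch of what that could look like, keeping the names from the question (Service, SomeService) and the lazily built static map; note the map is still unsynchronized, as the other answer points out:

import java.util.HashMap;
import java.util.Map;

import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;

// Assumes the question's own Service interface and SomeService bean are on the classpath.
@org.springframework.stereotype.Service
public class ServiceImpl implements ApplicationContextAware {

    private static Map<String, Service> map;
    private static ApplicationContext context;

    // Spring calls this once the bean is created, so the context is available
    // before getMapValueFor() is used at request time.
    @Override
    public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
        context = applicationContext;
    }

    public static Service getMapValueFor(String key) {
        if (map == null) {
            map = new HashMap<>();
            map.put("key", context.getBean(SomeService.class));
        }
        return map.get(key);
    }
}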
Here is a very basic solution. It uses the fact that the name of a @Bean is the name of the method which creates it; you will probably need a better strategy. The idea is to hide getBean inside a provider class which can then be autowired (and tested):
import org.springframework.beans.BeansException;
import org.springframework.context.ApplicationContext;
import org.springframework.context.ApplicationContextAware;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Bean;

public class Main {

    public static void main(String[] args) {
        ApplicationContext ctx = new AnnotationConfigApplicationContext(Config.class);
        CallableProvider provider = ctx.getBean(CallableProvider.class);
        System.out.println(provider.getCommand("aCommand").call());
        System.out.println(provider.getCommand("bCommand").call());
    }

    public static class Config {

        @Bean
        public ACommand aCommand() {
            return new ACommand();
        }

        @Bean
        public BCommand bCommand() {
            return new BCommand();
        }

        @Bean
        public CallableProvider callableProvider() {
            return new CallableProvider();
        }
    }

    public static class CallableProvider implements ApplicationContextAware {

        private ApplicationContext context;

        @Override
        public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
            this.context = applicationContext;
        }

        public Command getCommand(String name) {
            return context.getBean(name, Command.class);
        }
    }

    public static class ACommand implements Command {
        // autowire stuff

        @Override
        public String call() {
            return "A";
        }
    }

    public static class BCommand implements Command {
        // autowire stuff

        @Override
        public String call() {
            return "B";
        }
    }

    public interface Command {
        String call();
    }
}
I understood that classes that are not JSR-299 compliant can be made available for injection if instantiated via a producer method.
I have interpreted this to mean that if I want to make a bean with a parameterized constructor injectable, I can do so by using a producer method.
However, when I do this I get the following exception on deployment:
2015-11-11T21:35:12.099+0000|Grave: Exception during lifecycle processing
org.glassfish.deployment.common.DeploymentException: CDI deployment failure:Exception List with 2 exceptions:
Exception 0 :
org.jboss.weld.exceptions.DeploymentException: WELD-001435 Normal scoped bean class org.....MongoConfiguration is not proxyable because it has no no-args constructor - Producer Method [MongoConfiguration] with qualifiers [#Any #Default] declared as [[BackedAnnotatedMethod] #Produces #ApplicationScoped public org.....PropertiesProducer.produceMongoConfiguration()].
Here is the producer:
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.inject.Produces;

public class PropertiesProducer {

    private static final String PROPERTIES_FILE = "mongo.properties";
    private Properties properties = new Properties();

    public static final String DATABASE_NAME = "database.name";
    public static final String PORT = "database.port";
    public static final String HOST = "database.host";
    public static final String USERNAME = "database.username";
    public static final String PASSWORD = "database.password";

    @Produces
    @ApplicationScoped
    public MongoConfiguration produceMongoConfiguration() {
        final InputStream in = Thread.currentThread().getContextClassLoader().getResourceAsStream(PROPERTIES_FILE);
        if (in == null) {
            return new MongoConfiguration(properties);
        }
        try {
            properties.load(in);
        } catch (IOException e) {
            throw new RuntimeException("Failed to load properties", e);
        } finally {
            try {
                in.close();
            } catch (IOException e) {
                // don't care
            }
        }
        return new MongoConfiguration(properties);
    }
}
Here is the usage:
public class MongoDatastore {

    @Inject
    MongoConfiguration mongoConfiguration;

    @Inject
    MongoClient mongoClient;

    Datastore datastore;

    @PostConstruct
    private void setupDatastore() {
        Morphia morphia = new Morphia();
        datastore = morphia.createDatastore(mongoClient, mongoConfiguration.getDatabaseName());
    }
}
Have I missed something really obvious?
The simplest solution is to change the scope from @ApplicationScoped to @Singleton:

import javax.inject.Singleton;

@Produces
@Singleton
public MongoConfiguration produceMongoConfiguration() {
    // ...
    return new MongoConfiguration(properties);
}
For clarification you can see this SO answer.
BTW: the normal scope @ApplicationScoped is usually preferred over the @Singleton pseudo-scope. Why? For example, because serialization and deserialization of such a bean will work flawlessly. But in daily work we sometimes encounter a third-party class that is unproxyable and that we cannot change. In that case the possible solutions are:
Use the @Singleton pseudo-scope.
Create an interface and use it instead of the concrete bean (as the producer method's return type and in normal code), as sketched below.
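A minimal sketch of that second option, assuming a hypothetical MongoConfig interface that the concrete MongoConfiguration implements:

import java.util.Properties;

import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.inject.Produces;
import javax.inject.Inject;

// Hypothetical interface extracted from the unproxyable concrete class.
public interface MongoConfig {
    String getDatabaseName();
}

// The producer now exposes the interface; the concrete MongoConfiguration
// (assumed to implement MongoConfig) stays unchanged.
public class PropertiesProducer {

    private final Properties properties = new Properties();

    @Produces
    @ApplicationScoped
    public MongoConfig produceMongoConfiguration() {
        // ... load the properties file as in the original producer ...
        return new MongoConfiguration(properties);
    }
}

// Injection points ask for the interface, which a normal-scoped bean can proxy
// without requiring a no-args constructor on the concrete class.
public class MongoDatastore {

    @Inject
    MongoConfig mongoConfiguration;
}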