How to use a custom ErrorMessageSendingRecoverer in Spring Cloud Stream

I'm using the Kafka binder, if it matters.
I want to add some custom logic to the recover method of ErrorMessageSendingRecoverer (particularly, to modify the message a bit by adding new headers before publishing it to the error channel). It seems that extending this class, overriding that method, and registering it as a bean would be a good idea.
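For illustration, here is a minimal sketch of the kind of subclass I have in mind. The header name is made up, and instead of overriding recover() directly it plugs in a custom ErrorMessageStrategy to add the headers; the open question is how to make the binder use such a bean at all:
import java.util.Map;

import org.springframework.integration.handler.advice.ErrorMessageSendingRecoverer;
import org.springframework.messaging.MessageChannel;
import org.springframework.messaging.support.ErrorMessage;

public class HeaderAddingRecoverer extends ErrorMessageSendingRecoverer {

    public HeaderAddingRecoverer(MessageChannel errorChannel) {
        super(errorChannel);
        // add custom headers when the error message is built,
        // before it is sent to the error channel
        setErrorMessageStrategy((throwable, attributes) ->
                new ErrorMessage(throwable, Map.of("x-failure-source", "my-app")));
    }
}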
However, I cannot see a way to do it while using Spring Cloud Stream. In the code of AbstractMessageChannelBinder it is neither injected as a bean nor exposed to any customizers - it's simply created with new.
How can I solve this? Or is there another (intended) way to achieve the same thing?

Related

Integrating Google Guice support into a custom Java framework

So after a lot of searching and banging my head against a brick wall, I figured it was easier to put this question out there to see if anyone can help.
To set the backstory to this question:
I'm currently writing a custom Java framework for a project, which has a small internal DI system that just scans through a class, looks for any fields annotated with @Dependency (provided by the annotations package from my framework) and uses its internal Map to inject instances into those fields.
For reference, this framework is used by client code in the following way:
Client creates a new Singleton instance of the core manager
Client builds classes that extend a Base class and implement whatever logic they desire using the framework
Client registers their class with the manager (essentially: Manager#register(Class<? extends BaseClass>))
The manager will then instantiate the client's class and perform its dependency injection as mentioned above
I'm currently working on building Guice support into this framework so that clients who are using Guice in their code can get their dependencies from Guice if they so desire. I'd ideally like this to work alongside the framework's internal DI; however, this is not a hard requirement. To note, the framework itself does not use Guice.
I've currently got two options on my hands from what I can tell:
Option 1: Get the client to pass their Guice injector into my Guice Handler class and let Guice create the classes via injector.getInstance
Option 2: Let the client's Guice injector create my Guice Handler class and have a MembersInjector injected into the handler to then call injectMembers on the class instances where required.
Option 1, from what I can tell, is a bad approach, for the reasons described here: https://github.com/google/guice/wiki/InjectingTheInjector
So I'm currently working towards Option 2. This would require the client code to do something like:
Manager manager = new Manager();
Injector injector = Guice.createInjector(new ClientModule());
manager.enableGuiceSupport(injector.getInstance(GuiceHandler.class));
Or:
GuiceHandler guiceHandler = new GuiceHandler();
injector.injectMembers(guiceHandler);
manager.enableGuiceSupport(guiceHandler);
Obviously Guice needs to deal with the GuiceHandler class so that I can get the MembersInjector injected.
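For reference, a rough sketch of what I imagine the handler looking like. The field type is an assumption: since the client classes vary at runtime, injecting the Injector itself (rather than a single typed MembersInjector) may be what this actually needs:
import com.google.inject.Inject;
import com.google.inject.Injector;

public class GuiceHandler {

    // populated by Guice when the client calls injector.injectMembers(guiceHandler)
    // or injector.getInstance(GuiceHandler.class)
    @Inject
    private Injector injector;

    // called by the Manager after it instantiates a client's class
    public void injectMembers(Object frameworkCreatedInstance) {
        injector.injectMembers(frameworkCreatedInstance);
    }
}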
Looking at this, whilst it is a valid solution to the problem, it adds a requirement on the client that feels slightly labored. The aim of the framework is to reduce what the client code needs to do, not add to it.
So my question boils down to this:
Is my approach to Option 2 here a good way to move forward, and should I just accept that the client needs to "bootstrap" the GuiceHandler?
Or alternatively, is there just a better way to achieve the goal of adding Guice support?
Any thoughts, suggestions and constructive criticism are welcome!
(To everyone who's taken the time to read through all this to see if you can help, have a cookie: 🍪. I really appreciate your time!)

How to create a producer in Spring Cloud Stream Functional Model (v3.1+)?

How can I create a producer in the Spring Cloud Stream functional model?
The following approach is now deprecated:
@Output(OUTPUT)
MessageChannel outbound();
I know that it is possible to achieve this with a Java Supplier bean, but that will send a message every second. I don't need it to send every second; I am going to replace a REST API with Kafka.
Are there any ways to do that?
Use the StreamBridge - see Sending data to an arbitrary output.
Here we autowire a StreamBridge bean which allows us to send data to an output binding, effectively bridging a non-stream application with spring-cloud-stream. Note that the preceding example does not have any source functions defined (e.g., a Supplier bean), leaving the framework with no trigger to create source bindings, as it would for configurations that contain function beans. So, to trigger the creation of a source binding, we use the spring.cloud.stream.source property, where you can declare the name of your sources.
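For example, a minimal sketch of replacing a REST call with a StreamBridge send. The binding name and endpoint are made up; it assumes spring.cloud.stream.source=toKafka is set so the toKafka-out-0 output binding exists:
import org.springframework.cloud.stream.function.StreamBridge;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class EventController {

    private final StreamBridge streamBridge;

    public EventController(StreamBridge streamBridge) {
        this.streamBridge = streamBridge;
    }

    @PostMapping("/events")
    public void publish(@RequestBody String event) {
        // sends the payload to the "toKafka-out-0" binding only when called,
        // instead of polling a Supplier every second
        streamBridge.send("toKafka-out-0", event);
    }
}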
If you want to trigger a stream from an external Kafka topic, you can also bind a Spring Cloud Stream processor's input to that topic. The StreamBridge provides a layer of abstraction that may be cleaner, i.e., your non-stream application does not use the Kafka API directly.
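A minimal sketch of that processor-style option, with made-up function and topic names; each record arriving on the external topic triggers the function:
import java.util.function.Function;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ProcessorConfig {

    // bound to the external topic via, e.g.:
    // spring.cloud.stream.bindings.process-in-0.destination=incoming-topic
    // spring.cloud.stream.bindings.process-out-0.destination=outgoing-topic
    @Bean
    public Function<String, String> process() {
        return payload -> payload.toUpperCase();
    }
}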

Dependency injection and multiple instances

I am using the Spring Framework for dependency injection, but I simply cannot figure out whether I am using it right. Imagine this case - it is not real, but just to explain my problem. I have a Spring Boot application which connects with websockets to some endpoints. I have a class which has all the available methods for this client, stores all the data the client needs, etc. - let's say Client. Then I have a static list which holds all connected clients, List<Client>. I need the Client class to be a Spring-managed bean, as I need to use @Service and all the other Spring features (@Value, @Async, etc.).
The problem is, Spring beans are singletons, right? How can I then instantiate objects of a class that should be Spring-managed when there need to be multiple instances of that class? I cannot use new, right?
It isn't necessarily true that spring-created objects are singletons; this is merely the default. Spring supports a variety of different options for determining when a new object is created versus an old one being recycled. You should look at the documentation for the "scope" attribute and determine what is most appropriate for your application.
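For example, a minimal sketch of the prototype-scope approach. Class names follow the question; the ObjectProvider is just one way to ask the container for a fresh instance without calling getBean() yourself:
import java.util.ArrayList;
import java.util.List;

import org.springframework.beans.factory.ObjectProvider;
import org.springframework.beans.factory.config.ConfigurableBeanFactory;
import org.springframework.context.annotation.Scope;
import org.springframework.stereotype.Service;

@Service
@Scope(ConfigurableBeanFactory.SCOPE_PROTOTYPE)
class Client {
    // @Value, @Async etc. work here because every instance is container-managed
}

@Service
class ClientRegistry {

    private final List<Client> clients = new ArrayList<>();
    private final ObjectProvider<Client> clientProvider;

    ClientRegistry(ObjectProvider<Client> clientProvider) {
        this.clientProvider = clientProvider;
    }

    Client connect() {
        Client client = clientProvider.getObject(); // new prototype instance per call
        clients.add(client);
        return client;
    }
}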
Alternatively, you can create the object yourself using new and then request Spring to configure it for you using the technique described at http://docs.spring.io/spring/docs/current/spring-framework-reference/html/aop.html#aop-atconfigurable
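A rough sketch of that @Configurable route, assuming AspectJ weaving is set up (e.g. @EnableSpringConfigured plus load-time or compile-time weaving); the collaborator type is hypothetical:
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Configurable;

interface SomeDependency { } // hypothetical collaborator

@Configurable
public class Client {

    // injected even though the instance is created with plain new,
    // provided the weaving described above is in place
    @Autowired
    private SomeDependency someDependency;
}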

Should I use AOP to address this cross-cutting concern?

I've used Spring AOP before, but I'm not sure if that's the best method to go about this problem.
There's a service-layer class that has autowired DAOs to save an object. When an object is successfully saved, a message (SMS) should be sent to the object's supplied phone number.
Is it standard practice to keep the service unaware of the messaging bean using AOP, or to inject the bean into the service and send the message?
It totally depends on the business requirement; you can achieve the same thing using interceptors too. Once the object is saved, the interceptor can be invoked after the save and send the message, keeping the service unaware of the message-sending part.
I'm not totally sold on this being a valid use of AOP (see AOP use cases?).
Personally, I have no problem with the service layer being aware of the SMS message. However, as also mentioned in this thread, to avoid code duplication, I would look at an Entity Listener: http://www.mastertheboss.com/jboss-frameworks/hibernate-jpa/interceptors/jpa-entity-listeners
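For example, a minimal sketch of the entity-listener approach. Entity and listener names are made up; @PostPersist runs after the entity has been saved, so the service layer never sees the SMS concern:
import javax.persistence.Entity;
import javax.persistence.EntityListeners;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.PostPersist;

class SmsNotificationListener {

    @PostPersist
    void onSave(Order order) {
        // look up an SMS gateway here and send to order.getPhoneNumber()
    }
}

@Entity
@EntityListeners(SmsNotificationListener.class)
class Order {

    @Id
    @GeneratedValue
    private Long id;

    private String phoneNumber;

    public String getPhoneNumber() {
        return phoneNumber;
    }
}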

Get application components without #Autowired

How would you extract something from the .xml config prior to Spring 2.5? It bothers me because, if @Autowired were removed from my arsenal, I would not really know what to do.
Say I want to use some DAO implementation.
In a service class I usually write:
@Autowired
SomeDaoInterface generalDao;
Then I typically call
generalDao.someInterfaceMethod(param);
How would I extract the implementation from the config in Spring 2.0 to use this method?
Is it as dumb as just new ClassPathXmlApplicationContext(pathToXml) and then using .getBean(), or is there another way?
Why do I ask about taking a bean out of the configuration file?
Because in Spring MVC, how can you perform your logic without getting beans out of the application context?
If you have a @Controller handler, then you need to call the service classes' methods, so they have to be retrieved from the context somehow, and the only way so far is using @Autowired? I would also want to populate the service classes with DAO classes, as in the previous example, and those also need to be retrieved from the application context so that I can write the logic of the service classes themselves. How did people do it in the past?
I see @Autowired as the only means of taking something out, not because it is convenient to wire automatically - I am perfectly OK with XML.
You still have the option to wire it explicitly via a property or constructor parameter. (Anyway, autowiring is not going to work if there is ambiguity in your container.)
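For example, a minimal sketch of that explicit wiring (bean and class names are made up). In the XML:
<bean id="generalDao" class="com.example.JdbcGeneralDao"/>
<bean id="someService" class="com.example.SomeService">
    <property name="generalDao" ref="generalDao"/>
</bean>
And in the service class, a plain setter instead of @Autowired:
public class SomeService {

    private SomeDaoInterface generalDao;

    // called by the container because of the <property> element above
    public void setGeneralDao(SomeDaoInterface generalDao) {
        this.generalDao = generalDao;
    }
}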
Of course, you can use the application context and getBean() in your Java code, but it violates the DI pattern and makes all the Spring machinery useless. The purpose of DI is to decouple your business logic from implementation details - where its dependencies come from and how they are created is not business logic. Dependencies are just there.
By using ApplicationContext.getBean() you are breaking this pattern and introducing dependencies on:
Spring itself
your configuration names
Once you have done this, you might as well drop the use of DI and Spring, because you have just voided all the advantages DI provides. (BTW, @Autowired also introduces a dependency on Spring and violates the DI pattern; it also implies that there is only one instance available.)
Also, the answer is: in the ideal case there should be no reference to Spring in your code at all.
No imports, no annotations - just interfaces of collaborating entities.
