Add default public constructor to generated builder inner class - java

I am using the Lombok framework for boilerplate code generation, for example:
import lombok.*;

@Builder
@Value
public final class SocketConfig {
    @Builder.Default
    private int soTimeoutMilliseconds = 0;
    @Builder.Default
    private boolean soReuseAddress = false;
    @Builder.Default
    private int soLingerSeconds = -1;
    private boolean soKeepAlive;
    @Builder.Default
    private boolean tcpNoDelay = false;
}
In order to create builder instances I used to invoke SocketConfig.builder(). But for better integration with Spring bean creation I tried to create a FactoryBean, and got a compilation error due to the lack of a default constructor on the builder class; I didn't find any documentation about it. Is it possible with Lombok? I mean to generate a default constructor on the builder, not on the original class. In other words, I want two ways to create the builder instance: SocketConfig.builder() or new SocketConfig.SocketConfigBuilder().
import org.springframework.beans.factory.FactoryBean;

public class SocketConfigFactoryBean extends SocketConfig.SocketConfigBuilder implements FactoryBean<SocketConfig> {
    @Override
    public SocketConfig getObject() throws Exception {
        return build();
    }

    @Override
    public Class<?> getObjectType() {
        return SocketConfig.class;
    }

    @Override
    public boolean isSingleton() {
        return false;
    }
}

Use the @NoArgsConstructor annotation:
Generates a no-args constructor. Will generate an error message if such a constructor cannot be written due to the existence of final fields.
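For comparison, here is a hand-written (roughly delomboked) sketch of what such a builder can look like once the inner class has an explicit public no-args constructor. The names mirror the question, but this is an illustration, not actual Lombok output:

```java
// Hand-written sketch (not Lombok output) of a builder whose inner class
// has an explicit public no-args constructor, so both creation styles work.
final class SocketConfig {
    private final int soTimeoutMilliseconds;
    private final boolean soKeepAlive;

    private SocketConfig(int soTimeoutMilliseconds, boolean soKeepAlive) {
        this.soTimeoutMilliseconds = soTimeoutMilliseconds;
        this.soKeepAlive = soKeepAlive;
    }

    static SocketConfigBuilder builder() {
        return new SocketConfigBuilder();
    }

    int getSoTimeoutMilliseconds() {
        return soTimeoutMilliseconds;
    }

    boolean isSoKeepAlive() {
        return soKeepAlive;
    }

    static class SocketConfigBuilder {
        private int soTimeoutMilliseconds = 0; // the @Builder.Default equivalent
        private boolean soKeepAlive;

        // The explicit no-args constructor the question asks for;
        // it permits subclassing, e.g. by a FactoryBean.
        public SocketConfigBuilder() {
        }

        SocketConfigBuilder soTimeoutMilliseconds(int value) {
            this.soTimeoutMilliseconds = value;
            return this;
        }

        SocketConfigBuilder soKeepAlive(boolean value) {
            this.soKeepAlive = value;
            return this;
        }

        SocketConfig build() {
            return new SocketConfig(soTimeoutMilliseconds, soKeepAlive);
        }
    }
}
```

With this shape, both SocketConfig.builder() and new SocketConfig.SocketConfigBuilder() produce a usable builder.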

How to correctly add a method to an existing class with Byte Buddy

I am facing the same issue as the one described in "redefining method with Byte Buddy", however I am not sure how to adapt the solution to my use case:
I am trying to implement the active record pattern by delegating the method implementations to an
interceptor. The ActiveRecord base class is defined as follows:
public class ActiveRecord {
    private Long id;

    public Long getId() {
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }

    private static IllegalStateException implementationMissing() {
        return new IllegalStateException(
            "This method must be overridden in subclasses");
    }

    public static Long count() {
        throw implementationMissing();
    }

    public void save() {
        throw implementationMissing();
    }

    // extra methods omitted
}
A child class would then extend active record as follows:
class MapText extends ActiveRecord {
    private String text;
    private String description;
    private double wgs84Latitude;
    private double wgs84Longitude;
    // getters and setters omitted
}
Using Byte Buddy, I am trying to delegate the count and save methods to an interceptor class as follows:
@Test
void testRedefine() {
    ByteBuddyAgent.install();
    new ByteBuddy().redefine(MapText.class)
        .defineMethod("save", void.class, Visibility.PUBLIC)
        .intercept(MethodDelegation.to(ActiveRecordInterceptor.class))
        .defineMethod("count", Long.class, Visibility.PUBLIC)
        .intercept(MethodDelegation.to(ActiveRecordInterceptor.class))
        .make()
        .load(MapText.class.getClassLoader(), ClassReloadingStrategy.fromInstalledAgent());
    MapText mapText = new MapText();
    // set properties
    mapText.save();
    MapText.count();
}
Which generates the following exception:
java.lang.UnsupportedOperationException: class redefinition failed: attempted to add a method
If I add empty "placeholder" methods for save() and count() in MapText, then everything works fine.
How should I adapt my code to make the delegation work without requiring empty placeholder methods in the subclass?
Edit: changed the code to use the AgentBuilder API according to feedback
@Test
void testRedefine() {
    ByteBuddyAgent.install();
    new AgentBuilder.Default()
        .disableClassFormatChanges()
        .with(AgentBuilder.RedefinitionStrategy.REDEFINITION)
        .type(ElementMatchers.named("pkg.MapText"))
        .transform(new AgentBuilder.Transformer() {
            @Override
            public DynamicType.Builder<?> transform(DynamicType.Builder<?> builder, TypeDescription typeDescription, ClassLoader classLoader, JavaModule javaModule) {
                return builder.defineMethod("save", void.class, Visibility.PUBLIC)
                    .intercept(MethodDelegation.to(ActiveRecordInterceptor.class));
            }
        })
        .with(new ListenerImpl())
        .installOnByteBuddyAgent();
    CallTextSave callTextSave = new CallTextSave();
    callTextSave.save();
}
CallTextSave encapsulates the MapText class and calls its save method. Unfortunately, MapText.save() is not intercepted.
public class CallTextSave {
    public void save() {
        MapText text = new MapText();
        text.save(); // Method not intercepted
    }
}
If you want to alter code this way, you need to do it before the class is loaded for the first time. You can do so by defining a Java agent using the AgentBuilder API. You must avoid referring to the loaded class in the agent code; instead, use named for a matcher that takes the string name as an argument.
Alternatively, you can redefine the class in your main method by resolving it using a TypePool.Default. Again, resolve the TypeDescription by name and avoid loading the class. Also, move the actual code to a different class, as the JVM validator will otherwise load the class in question.
This latter approach is only possible if you control the life cycle of your application.
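The TypePool-based alternative can be sketched as below. This is a minimal illustration, assuming a recent byte-buddy version on the classpath; StubMethod stands in for the MethodDelegation from the question, and java.util.BitSet stands in for MapText so the snippet is self-contained:

```java
import net.bytebuddy.ByteBuddy;
import net.bytebuddy.description.modifier.Visibility;
import net.bytebuddy.description.type.TypeDescription;
import net.bytebuddy.dynamic.ClassFileLocator;
import net.bytebuddy.implementation.StubMethod;
import net.bytebuddy.pool.TypePool;

class RedefineByName {
    // Resolve the target purely by its string name so this code never loads
    // the class itself, then generate redefined bytecode with an added
    // save() method (stubbed here instead of delegated to an interceptor).
    static byte[] addSaveMethod(String className) {
        TypePool typePool = TypePool.Default.ofSystemLoader();
        TypeDescription type = typePool.describe(className).resolve();
        return new ByteBuddy()
                .redefine(type, ClassFileLocator.ForClassLoader.ofSystemLoader())
                .defineMethod("save", void.class, Visibility.PUBLIC)
                .intercept(StubMethod.INSTANCE)
                .make()
                .getBytes();
    }
}
```

The generated bytes would still have to be installed through an instrumentation-based strategy such as ClassReloadingStrategy, and adding methods to an already loaded class remains subject to the same JVM restriction; the point of the sketch is only that resolving by name does not load the target class.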

Implementing shared logic for multiple KafkaListeners in spring-kafka

My Spring Boot application contains several @KafkaListeners, and each listener performs the same steps before and after actually processing the payload: validate the payload, check whether the event has been processed already, check whether it's a tombstone (null) message, decide whether processing should be retried in case of failure, emit metrics, etc.
These steps are currently implemented in a base class, but because the topics passed to @KafkaListener must be constant at runtime, the method annotated with @KafkaListener is defined in the subclass and does nothing but pass its parameters to a method in the base class.
This works just fine, but I wonder if there's a more elegant solution. I assume my base class would have to create a listener container programmatically, but after a quick look at KafkaListenerAnnotationBeanPostProcessor, it seems to be quite involved.
Does anyone have any recommendations?
Having stumbled upon this question while looking to implement something similar, I first started with Artem Bilan's answer. However, this did not work, because annotations by default are not inherited in child classes unless they are themselves annotated with @Inherited. Despite this, there may yet be a way to make an annotation approach work, and I will update this answer if and when I get it to work. Thankfully, though, I have achieved the desired behaviour using programmatic registration of the Kafka listeners.
My code is something like the following:
Interface:
public interface GenericKafkaListener {
    String METHOD = "handleMessage";

    void handleMessage(ConsumerRecord<String, String> record);
}
Abstract Class:
public abstract class AbstractGenericKafkaListener implements GenericKafkaListener {
    private final String kafkaTopic;

    public AbstractGenericKafkaListener(final String kafkaTopic) {
        this.kafkaTopic = kafkaTopic;
    }

    @Override
    public void handleMessage(final ConsumerRecord<String, String> record) {
        // do common logic here
        specificLogic(record);
    }

    protected abstract void specificLogic(ConsumerRecord<String, String> record);

    public String getKafkaTopic() {
        return kafkaTopic;
    }
}
We can then programmatically register all beans of type AbstractGenericKafkaListener in a KafkaListenerConfigurer:
@Configuration
public class KafkaListenerConfiguration implements KafkaListenerConfigurer {

    @Autowired
    private List<AbstractGenericKafkaListener> listeners;

    @Autowired
    private BeanFactory beanFactory;

    @Autowired
    private MessageHandlerMethodFactory messageHandlerMethodFactory;

    @Autowired
    private KafkaListenerContainerFactory kafkaListenerContainerFactory;

    @Value("${your.kafka.consumer.group-id}")
    private String consumerGroup;

    @Value("${your.application.name}")
    private String service;

    @Override
    public void configureKafkaListeners(final KafkaListenerEndpointRegistrar registrar) {
        final Method listenerMethod = lookUpMethod();
        listeners.forEach(listener ->
            registerListenerEndpoint(listener, listenerMethod, registrar));
    }

    private void registerListenerEndpoint(final AbstractGenericKafkaListener listener,
                                          final Method listenerMethod,
                                          final KafkaListenerEndpointRegistrar registrar) {
        log.info("Registering {} endpoint on topic {}", listener.getClass(),
            listener.getKafkaTopic());
        final MethodKafkaListenerEndpoint<String, String> endpoint =
            createListenerEndpoint(listener, listenerMethod);
        registrar.registerEndpoint(endpoint);
    }

    private MethodKafkaListenerEndpoint<String, String> createListenerEndpoint(
            final AbstractGenericKafkaListener listener, final Method listenerMethod) {
        final MethodKafkaListenerEndpoint<String, String> endpoint = new MethodKafkaListenerEndpoint<>();
        endpoint.setBeanFactory(beanFactory);
        endpoint.setBean(listener);
        endpoint.setMethod(listenerMethod);
        endpoint.setId(service + "-" + listener.getKafkaTopic());
        endpoint.setGroup(consumerGroup);
        endpoint.setTopics(listener.getKafkaTopic());
        endpoint.setMessageHandlerMethodFactory(messageHandlerMethodFactory);
        return endpoint;
    }

    private Method lookUpMethod() {
        return Arrays.stream(GenericKafkaListener.class.getMethods())
            .filter(m -> m.getName().equals(GenericKafkaListener.METHOD))
            .findAny()
            .orElseThrow(() ->
                new IllegalStateException("Could not find method " + GenericKafkaListener.METHOD));
    }
}
How about this:
public abstract class BaseKafkaProcessingLogic {
    @KafkaHandler
    public void handle(Object payload) {
    }
}

@KafkaListener(topics = "topic1")
public class Topic1Handler extends BaseKafkaProcessingLogic {
}

@KafkaListener(topics = "topic2")
public class Topic2Handler extends BaseKafkaProcessingLogic {
}
?
I needed the same functionality and came up with a solution close to Artem Bilan's answer. Yes, the @KafkaHandler annotation is not inherited by child classes, but when it is defined in an interface, it is picked up. Here is the solution:
interface AbstractKafkaListener<T> {
    default Class<T> getCommandType() {
        TypeToken<T> type = new TypeToken<>(getClass()) {};
        return (Class<T>) type.getRawType();
    }

    @KafkaHandler
    default void handle(String message) throws JsonProcessingException {
        ObjectMapper objectMapper = new ObjectMapper();
        T value = objectMapper.readValue(message, getCommandType());
        handle(value);
    }

    void handle(T message);
}
The class should implement the handle method only:
@Component
@KafkaListener(topics = "my_topic")
public class KafkaListenerForMyCustomMessage implements AbstractKafkaListener<MyCustomMessage> {
    @Override
    public void handle(MyCustomMessage message) {
        System.out.println(message);
    }
}
The two methods implemented in the interface should ideally be private or protected, but because they live in an interface this cannot be done: default methods are always public. In fact, all methods defined in an interface are public.
I use this solution to dynamically parse the message from Kafka (received as a String) into the custom class.
The getCommandType method returns the class of the generic parameter T. TypeToken comes from the Google Guava package.
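If you prefer not to depend on Guava, a plain-reflection sketch of the same lookup is possible. It works only when, as in the example above, the implementing class supplies the type argument directly on the interface:

```java
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;

// Plain-reflection stand-in for Guava's TypeToken: walk the directly
// implemented interfaces and pick out the type argument supplied for
// AbstractKafkaListener. Works only for direct, concrete implementations.
interface AbstractKafkaListener<T> {
    @SuppressWarnings("unchecked")
    default Class<T> getCommandType() {
        for (Type iface : getClass().getGenericInterfaces()) {
            if (iface instanceof ParameterizedType) {
                ParameterizedType parameterized = (ParameterizedType) iface;
                if (parameterized.getRawType() == AbstractKafkaListener.class) {
                    return (Class<T>) parameterized.getActualTypeArguments()[0];
                }
            }
        }
        throw new IllegalStateException("Cannot resolve type argument");
    }
}

class MyCustomMessage {
}

class KafkaListenerForMyCustomMessage implements AbstractKafkaListener<MyCustomMessage> {
}
```

Guava's TypeToken is more general (it also resolves type arguments declared further up the hierarchy), which is why the answer above uses it.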

@XmlJavaTypeAdapter not being called when used with @XmlSeeAlso

I can't get the XmlJavaTypeAdapter working when it is set at class level on a subclass of an abstract class annotated with XmlSeeAlso. The serialization runs, but the adapter is never called.
@XmlRootElement(name = "root")
public class TopLevelClassBeingSerialized {
    private Set<MyAbstractClass> set = new HashSet<>();
}

@XmlSeeAlso({MyImplClass.class})
// @XmlJavaTypeAdapter(MyAbstractClassAdapter.class)
public class MyAbstractClass {
}

@XmlJavaTypeAdapter(MyImplClassAdapter.class)
public class MyImplClass {
    private int test = 2;
}
When that commented-out annotation is uncommented, the adapter is called. I have also tried the setAdapter() method, and it doesn't work. Is this a limitation of JAXB? If so, what's the best and cleanest workaround?

Dagger 2 - injecting multiple objects of same type using @Named not working

My module:
@Module
public class TcpManagerModule {
    private ITcpConnection eventsTcpConnection;
    private ITcpConnection commandsTcpConnection;

    public TcpManagerModule(Context context) {
        eventsTcpConnection = new EventsTcpConnection(context);
        commandsTcpConnection = new CommandsTcpConnection(context);
    }

    @Provides
    @Named("events")
    public ITcpConnection provideEventsTcpConnection() {
        return eventsTcpConnection;
    }

    @Provides
    @Named("commands")
    public ITcpConnection provideCommandsTcpConnection() {
        return commandsTcpConnection;
    }
}
Component:
@Component(modules = TcpManagerModule.class)
public interface TcpManagerComponent {
    void inject(ITcpManager tcpManager);
}
The class where the injection happens:
public class DefaultTcpManager implements ITcpManager {
    private TcpManagerComponent tcpComponent;

    @Inject @Named("events") ITcpConnection eventsTcpConnection;
    @Inject @Named("commands") ITcpConnection commandsTcpConnection;

    public DefaultTcpManager(Context context) {
        tcpComponent = DaggerTcpManagerComponent.builder().tcpManagerModule(new TcpManagerModule(context)).build();
        tcpComponent.inject(this);
    }

    @Override
    public void startEventsConnection() {
        eventsTcpConnection.startListener();
        eventsTcpConnection.connect();
    }
}
When I call startEventsConnection, I get NullPointerException - meaning the injection didn't populate the fields.
I followed the example exactly the way it is on the Docs, what is the issue?
Note: on the builder line
tcpComponent = DaggerTcpManagerComponent.builder().tcpManagerModule(new TcpManagerModule(context)).build();
I have a warning saying "tcpManagerModule is deprecated". I read the answer here about this issue, and it says:
It is safe to say that you can just ignore the deprecation. It is intended to notify you of unused methods and modules. As soon as you actually require / use Application somewhere in your subgraph the module is going to be needed, and the deprecation warning will go away.
So, am I not requiring/using the instances? What is the issue here?
You could try changing your Component defining the specific class for injection:
@Component(modules = TcpManagerModule.class)
public interface TcpManagerComponent {
    void inject(DefaultTcpManager tcpManager);
}
So that Dagger knows exactly about DefaultTcpManager.class.

meanbean: failed to test bean with arrays

I am using Mean Bean (http://meanbean.sourceforge.net) to validate/test my beans. It works fine for most of the beans, but when a bean has arrays in it, it fails with the following error.
SEVERE: getFactory: Failed to find suitable Factory for property=[hostNames] of type=[class [I]. Please register a custom Factory. Throw NoSuchFactoryException.
org.meanbean.factories.ObjectCreationException: Failed to instantiate object of type [[I] due to NoSuchMethodException.
Following is my sample code.
public class Machine {
    private String[] hostNames;

    public String[] getHostNames() {
        return hostNames;
    }

    public void setHostNames(String[] hostNames) {
        this.hostNames = hostNames;
    }
}
import org.junit.Test;
import org.meanbean.test.BeanTester;

public class TestBeanUtil {
    @Test
    public void test1() {
        new BeanTester().testBean(Machine.class);
    }
}
Any help on how to get rid of this error? I found one way, by ignoring specific fields like below.
Configuration configuration = new ConfigurationBuilder().ignoreProperty("hostNames").build();
new BeanTester().testBean(Machine.class, configuration);
But my concern is: is there any way that I can test without ignoring a specific property, or ignore all arrays in one shot?
You can create a custom factory for your field:
class HostNamesFactory implements Factory<String[]> {
    @Override
    public String[] create() {
        return new String[] {"host1", "host2", "host3"};
    }
}
Then use this factory when you create your custom configuration and pass it to the bean tester:
Configuration configuration = new ConfigurationBuilder().overrideFactory("hostNames", new HostNamesFactory()).build();
new BeanTester().testBean(Machine.class, configuration);
I agree this is not the perfect solution, but at least the property's getter and setter will be tested.
Maybe too late, but you can include in the AbstractJavaBeanTest the factories for each property type that cannot be created.
Here is a sample for when you need a factory for LocalDateTime and UUID:
@Test
public void getterAndSetterCorrectness() throws Exception {
    final BeanTester beanTester = new BeanTester();
    beanTester.getFactoryCollection().addFactory(LocalDateTime.class,
        new LocalDateTimeFactory());
    beanTester.getFactoryCollection().addFactory(UUID.class,
        new ExecutionUUIDFactory());
    beanTester.testBean(getBeanInstance().getClass());
}
Then you just define the custom factories that you want (in this example):
/**
 * Concrete Factory that creates a LocalDateTime.
 */
class LocalDateTimeFactory implements Factory<LocalDateTime> {
    @Override
    public LocalDateTime create() {
        return LocalDateTime.now();
    }
}

/**
 * Concrete Factory that creates a UUID.
 */
class ExecutionUUIDFactory implements Factory<UUID> {
    @Override
    public UUID create() {
        return UUID.randomUUID();
    }
}
