UDP server using Spring+Netty - java

I'm trying to set up a simple UDP server using Netty, following the example here, but using Spring for wiring dependencies.
My Spring config class:
@Configuration
@ComponentScan("com.example.netty")
public class SpringConfig {

    @Value("${netty.nThreads}")
    private int nThreads;

    @Autowired
    private MyHandlerA myHandlerA;

    @Autowired
    private MyHandlerB myHandlerB;

    @Bean(name = "bootstrap")
    public Bootstrap bootstrap() {
        Bootstrap b = new Bootstrap();
        b.group(group())
         .channel(NioDatagramChannel.class)
         .handler(new ChannelInitializer<DatagramChannel>() {
             @Override
             protected void initChannel(DatagramChannel ch) throws Exception {
                 ch.pipeline().addLast(myHandlerA, myHandlerB);
             }
         });
        return b;
    }

    @Bean(name = "group", destroyMethod = "shutdownGracefully")
    public NioEventLoopGroup group() {
        return new NioEventLoopGroup(nThreads);
    }

    @Bean
    public static PropertySourcesPlaceholderConfigurer propertyPlaceholderConfigurer() {
        return new PropertySourcesPlaceholderConfigurer();
    }
}
My server class:
@Component
public class MyUDPServer {

    @Autowired
    private Bootstrap bootstrap;

    @Value("${host}")
    private String host;

    @Value("${port}")
    private int port;

    @PostConstruct
    public void start() throws Exception {
        bootstrap.bind(host, port).sync().channel().closeFuture().await();
        /* Never reached since the main thread blocks due to the call to await() */
    }
}
During the blocking call to await(), I don't see my application listening on the specified interface. I've tried running the sample (setting up the server directly from the main method) and it works. I couldn't find any examples of setting up a UDP server with Netty and Spring.
Thanks, Mickael
EDIT:
In order to avoid blocking the Main thread (which is used for Spring configuration), I've created a new thread as follows:
@Component
public class MyUDPServer extends Thread {

    @Autowired
    private Bootstrap bootstrap;

    @Value("${host}")
    private String host;

    @Value("${port}")
    private int port;

    public MyUDPServer() {
        setName("UDP Server");
    }

    @PostConstruct
    @Override
    public synchronized void start() {
        super.start();
    }

    @Override
    public void run() {
        try {
            bootstrap.bind(host, port).sync().channel().closeFuture().await();
        } catch (InterruptedException e) {
        } finally {
            bootstrap.group().shutdownGracefully();
        }
    }

    @PreDestroy
    @Override
    public void interrupt() {
        super.interrupt();
    }
}
I can see the new thread is blocked waiting for the channel to close (as in the example), and the main thread can continue with the Spring configuration. However, it still doesn't work.

There is no need to wait for the channel to terminate in @PostConstruct. Try removing await().
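A minimal sketch of that approach, using the same beans as above: sync() is used only to confirm the bind succeeded, the channel reference is kept, and the channel is closed from @PreDestroy instead of blocking the startup thread until the channel closes.

@Component
public class MyUDPServer {

    @Autowired
    private Bootstrap bootstrap;

    @Value("${host}")
    private String host;

    @Value("${port}")
    private int port;

    private Channel channel;

    @PostConstruct
    public void start() throws Exception {
        // bind() returns a ChannelFuture; sync() only waits for the bind itself,
        // not for the channel to close, so Spring's startup thread is not blocked
        channel = bootstrap.bind(host, port).sync().channel();
    }

    @PreDestroy
    public void stop() {
        if (channel != null) {
            channel.close();
        }
    }
}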

Related

Spring live reload for ThreadPoolExecutor

I am not really familiar with Spring's live reload feature. However, I noticed that every time I save changes to my Java code, a Java profiler (JMC) or Eclipse (debug mode) shows that the thread I had already spawned is spawned again, resulting in 2 threads (1 thread before the reload + 1 thread after the reload).
I am currently using a ThreadPoolExecutor to spawn my threads; basically I am letting Spring manage them. In this case, how do I force shutdown/interrupt the threads I spawned when a live reload occurs?
Below is my source.
ApplicationThreadingConfiguration.java
@Configuration
public class ApplicationThreadingConfiguration {

    private static final Logger logger = LoggerFactory.getLogger(ApplicationThreadingConfiguration.class);

    @Autowired
    MyProperties prop;

    @Bean(name = "myThread")
    public TaskExecutor taskExecutor() {
        logger.info(prop.toString());
        final ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(prop.getCorePoolSize());
        executor.setMaxPoolSize(prop.getMaxPoolSize());
        executor.setQueueCapacity(prop.getQueueCapacity());
        executor.setThreadNamePrefix("MyThread-");
        executor.setWaitForTasksToCompleteOnShutdown(true);
        executor.initialize();
        return executor;
    }
}
AppEventConfiguration.java (this spawns my threads after the Spring context is loaded)
@Configuration
public class AppEventConfiguration {

    private static final Logger logger = LoggerFactory.getLogger(AppEventConfiguration.class);

    @Autowired
    private MyService service;

    @Autowired
    private ApplicationContext context;

    @Autowired
    @Qualifier("myThread")
    private TaskExecutor executor;

    @EventListener(ApplicationReadyEvent.class)
    public void onApplicationReadyEvent() {
        service.getSomethingFromDB().stream().forEach(dto -> {
            logger.info("Id: {}", dto.getId());
            MyRunnableThread t = this.context.getBean(MyRunnableThread.class);
            t.setMyId(dto.getId());
            executor.execute(t);
        });
    }
}
MyRunnableThread.java
#Component
#Scope("prototype")
public class MyRunnableThread implements Runnable {
private static final Logger logger = LoggerFactory.getLogger(MyRunnableThread.class);
private long myId;
#Override
public void run() {
while(true) {
try {
logger.info("Do something here on ID: {}",this.myId);
Thread.sleep(5000);
} catch (InterruptedException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
public void setMyId(long id) {
this.myId = myId;
}
}
myproperties.properties
myprop.core-pool-size=80
myprop.max-pool-size=100
myprop.queue-capacity=100
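For what it's worth, any shutdown strategy (for example shutting down the ThreadPoolTaskExecutor, or calling shutdownNow() on its underlying executor) can only stop these workers if the run() loop reacts to interruption. A minimal sketch of such a loop, reusing the names from MyRunnableThread above:

@Override
public void run() {
    // Exit when the worker thread is interrupted (e.g. when the executor is shut down)
    while (!Thread.currentThread().isInterrupted()) {
        try {
            logger.info("Do something here on ID: {}", this.myId);
            Thread.sleep(5000);
        } catch (InterruptedException e) {
            // Restore the interrupt flag so the loop condition sees it and the task ends
            Thread.currentThread().interrupt();
        }
    }
}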

Spring Boot: How to Update Beans?

I have a @Bean which calls an external API at the start of the application. How can I have it make a new call and update the bean on a set timer?
@Bean
public Template apiCall() {
    final String uri = "http://...";
    return new RestTemplate().getForObject(uri, Template.class);
}
One way is to reload the template in some kind of provider/factory bean.
@Component
public class TemplateProvider {

    @Value("${template.uri}")
    private String uri;

    private Template template;

    @Autowired
    private RestTemplate restTemplate;

    @PostConstruct
    public void init() {
        loadTemplate();
    }

    public synchronized void reset() {
        loadTemplate();
    }

    public synchronized Template template() {
        return template;
    }

    private void loadTemplate() {
        try {
            this.template = restTemplate.getForObject(uri, Template.class);
        } catch (Exception e) {
            // ignore and keep the previously loaded template
        }
    }
}
Then you can call reset() from a @Scheduled method, as sketched after the client example below.
The only drawback is that callers should not keep a reference to the Template in their own state; always access the template via the provider to avoid inconsistency problems.
public class Client {

    @Autowired
    private TemplateProvider templateProvider;

    public void method() {
        templateProvider.template().method();
    }
}
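A minimal sketch of the scheduled refresh. It assumes @EnableScheduling is present on a configuration class; the class name and the template.refresh-ms property are illustrative, not part of the original answer.

@Component
public class TemplateRefresher {

    @Autowired
    private TemplateProvider templateProvider;

    // Re-fetch the template periodically; the interval comes from configuration
    @Scheduled(fixedRateString = "${template.refresh-ms:60000}")
    public void refresh() {
        templateProvider.reset();
    }
}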

Is it possible to give an EJB service call a callback

Is it possible to make an EJB service which accepts callbacks and invokes them on the client that called the service? The use case is: uploading a large byte array to the service, which will parse it, transform the result into objects, and persist them. I want to notify the client which of these steps are done.
@Local
public interface MyService {
    Status upload(byte[] content, Callable<Void> onReceived, Callable<Void> onPersisting);
}

@Stateless(name = "MyService")
public class MyServiceImpl implements MyService {

    public Status upload(byte[] content, Callable<Void> onReceived, Callable<Void> onPersisting) {
        // Invoke this because all data has been transferred to the server.
        onReceived.call();
        // Do the parsing stuff ...
        onPersisting.call();
        // Do the persisting stuff ...
        return new Status(...); // Done or failed or such.
    }
}
On the client I pass in the callables:
Context ctx = ...
MyService service = ctx.get(...);
service.upload(bytes, new Callable<Void>() {
    @Override
    public Void call() {
        // Do something
        return null;
    }
}, new Callable<Void>() {
    @Override
    public Void call() {
        // Do something
        return null;
    }
});
Is something like that possible in EJB?
I'm new to the JEE world: I know that the client gets stubs of the EJB interface and that the calls are transferred by "background magic" to the server's real EJB implementation.
Case 1: Using a local business interface (or no-interface view)
Yes, it's possible as long as your service is only accessed through a local business interface. Why? A local business interface can only be accessed by a local client.
A local client has these characteristics [LocalClients]:
It must run in the same application as the enterprise bean it accesses.
It can be a web component or another enterprise bean.
To the local client, the location of the enterprise bean it accesses is not transparent.
To summarize the important characteristics: it runs in the same application, and thus in the same JVM; it is a web or EJB component; and the location of the accessed bean is not transparent to the local client. Please take a look at LocalClients for more details.
Below is a simple Hello World example. My example uses a no-interface view, which is equivalent to a local business interface.
Edit: Example extended with a JNDI lookup.
/** Service class */
import javax.ejb.Stateless;

@Stateless
public class Service {
    public void upload(final Callback callback) {
        callback.call();
    }
}

/** Callback class */
public class Callback {
    public void call() {
        System.out.println(this + " called.");
    }
}

/** Trigger class */
import javax.ejb.EJB;
import javax.ejb.Schedule;
import javax.ejb.Singleton;
import javax.naming.InitialContext;
import javax.naming.NamingException;

@Singleton
public class Trigger {

    @EJB
    Service service;

    @Schedule(second = "*/5", minute = "*", hour = "*", persistent = false)
    public void triggerService() {
        System.out.println("Trigger Service call");
        service.upload(new Callback());

        // or by JNDI lookup and method overriding
        try {
            Service serviceByLookup = (Service) InitialContext.doLookup("java:module/Service");
            serviceByLookup.upload(new Callback() {
                @Override
                public void call() {
                    System.out.println("Overridden: " + super.toString());
                }
            });
        } catch (final NamingException e) {
            e.printStackTrace();
        }
    }
}
It's also possible to implement the Callback class as a stateless bean and inject it into the Service class.
/** Service class */
@Stateless
public class Service {

    @EJB
    Callback callback;

    public void upload() {
        callback.call();
    }
}
Case 2: Using a remote business interface
If you are using a remote business interface, it's not possible to pass a callback object to your EJB. To get status information back to your client, you have to use JMS.
Below is a short kick-off example.
@Remote
public interface IService {
    void upload();
}

@Stateless
public class Service implements IService {

    @EJB
    private AsyncUploadStateSender uploadStateSender;

    @Override
    public void upload() {
        for (int i = 0; i <= 100; i += 10) {
            uploadStateSender.sendState(i);
            try {
                Thread.sleep(1000L);
            } catch (final InterruptedException e) {
                e.printStackTrace();
            }
        }
    }
}

@Stateless
public class AsyncUploadStateSender {

    @Resource(lookup = "jms/myQueue")
    private Queue queue;

    @Inject
    private JMSContext jmsContext;

    @Asynchronous
    public void sendState(final int state) {
        final JMSProducer producer = jmsContext.createProducer();
        final TextMessage msg = jmsContext.createTextMessage("STATE CHANGED " + state + "%");
        producer.send(queue, msg);
    }
}

public class Client {

    public static void main(final String[] args) throws NamingException, InterruptedException, JMSException {
        final InitialContext ctx = ... // create the InitialContext
        final IService service = (IService) ctx.lookup("<JNDI NAME OF IService>");
        final ConnectionFactory factory = (ConnectionFactory) ctx.lookup("jms/__defaultConnectionFactory");
        final Queue queue = (Queue) ctx.lookup("jms/myQueue");

        // set consumer
        final Connection connection = factory.createConnection();
        final MessageConsumer consumer = connection.createSession().createConsumer(queue);
        consumer.setMessageListener(new MessageListener() {
            @Override
            public void onMessage(final Message msg) {
                try {
                    System.out.println(((TextMessage) msg).getText());
                } catch (final JMSException e) {
                    e.printStackTrace();
                }
            }
        });
        connection.start();

        // start upload
        service.upload();
        Thread.sleep(1000L);
    }
}
Note: you have to create the queue jms/myQueue and the connection factory jms/__defaultConnectionFactory in your application server to make the example work.
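If your server supports Java EE 7 resource definitions, one way to declare those resources is with annotations on any managed component, as in the sketch below. The class name is illustrative, the exact JNDI name handling (e.g. whether a java: namespace prefix is required) differs between servers, and the same resources can equally be created in the server's admin console.

import javax.ejb.Singleton;
import javax.jms.JMSConnectionFactoryDefinition;
import javax.jms.JMSDestinationDefinition;

@JMSDestinationDefinition(
        name = "jms/myQueue",
        interfaceName = "javax.jms.Queue",
        destinationName = "myQueue")
@JMSConnectionFactoryDefinition(name = "jms/__defaultConnectionFactory")
@Singleton
public class JmsResourceConfig {
    // No code needed; the annotations ask the container to create the resources on deployment.
}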

Aspect does not get triggered in the context of listening for a RabbitMQ message

The FailedMessageAspect.afterMethod() below gets called successfully during RabbitConsumerMain.run() below. However, it doesn't get called in the context of listening for a RabbitMQ message, i.e. when MessageHandlerImpl.handleMessage() receives a message from a RabbitMQ queue. Any idea why?
FailedMessageAspect.java
@Aspect
@Component
public class FailedMessageAspect {

    @AfterReturning("execution(* com..MessageHandlerImpl.testAspect(..))")
    private void afterMethod() {
        System.out.println("aspect foo");
    }
}
MessageHandlerImpl.java
#Component
public class MessageHandlerImpl implements MessageHandler {
#Override
public void testAspect() {
System.out.println("handler foo");
}
#Override
public void handleMessage(String message) {
// handleMessage is called successfully when message is received
testAspect();
// FailedMessageAspect.afterMethod() does not get called
}
}
RabbitConsumerMain.java
#Controller
#SpringBootApplication
public class RabbitConsumerMain implements CommandLineRunner {
#Autowired
private MessageHandler messageHandler;
public static void main(String[] args) throws Exception {
SpringApplication.run(RabbitConsumerMain.class, args);
}
#Override
public void run(String... args) {
messageHandler.testAspect();
//FailedMessageSpect.afterMethod() gets called right here
}
}
ConsumerConfiguration.java
#Configuration
public class ConsumerConfiguration {
#Autowired #Lazy
private MessageHandler messageHandler;
//other standard AMQP configs
#Bean
public MessageListenerContainer messageListenerContainer() {
SimpleMessageListenerContainer container = new SimpleMessageListenerContainer();
container.setConnectionFactory(connectionFactory());
container.setQueues(workQueue());
MessageListenerAdapter adapter = new MessageListenerAdapter(messageHandler, new Jackson2JsonMessageConverter());
container.setMessageListener(adapter);
return container;
}
}
You don't show all of your configuration but, just to be clear, Spring AOP does not advise internal method calls such as handleMessage() calling testAspect() on the same class instance.
You need AspectJ weaving for that; otherwise, the methods you advise must be public methods invoked on the bean from outside, so that Spring can intercept the call through its proxy. Internal calls within a bean are never advised.
See the reference manual for a complete explanation.
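A common workaround, sketched below under the assumption that the advised method can live in its own bean, is to move testAspect() into a separate component so the call from handleMessage() goes through the Spring proxy. The class name AspectTarget is hypothetical, and the pointcut in FailedMessageAspect would need to reference it instead of MessageHandlerImpl.

@Component
public class AspectTarget {

    // Calls to this method arrive through the Spring proxy, so the @AfterReturning advice fires
    public void testAspect() {
        System.out.println("handler foo");
    }
}

@Component
public class MessageHandlerImpl implements MessageHandler {

    @Autowired
    private AspectTarget aspectTarget;

    @Override
    public void testAspect() {
        aspectTarget.testAspect();
    }

    @Override
    public void handleMessage(String message) {
        aspectTarget.testAspect(); // external call via the injected proxy, so the advice is applied
    }
}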

How to manage/stop Spring 4 ConcurrentTaskScheduler

I am using Spring 4. I use this to execute a task periodically for web sockets:
private TaskScheduler scheduler = new ConcurrentTaskScheduler();
In my class:
@Configuration
@EnableWebSocketMessageBroker
@EnableScheduling
@Component
public class WebSocketConfig implements WebSocketMessageBrokerConfigurer {

    @Autowired
    private SimpMessagingTemplate template;

    private TaskScheduler scheduler = new ConcurrentTaskScheduler();

    public void registerStompEndpoints(StompEndpointRegistry registry) {
        registry.addEndpoint("/simplemessages").withSockJS();
    }

    public void configureMessageBroker(MessageBrokerRegistry config) {
        config.enableSimpleBroker("/topic/", "/queue/");
        config.setApplicationDestinationPrefixes("/app");
    }

    @PostConstruct
    private void broadcastTimePeriodically() {
        scheduler.scheduleAtFixedRate(new Runnable() {
            public void run() {
                String statStr = "Server Response" + new Date();
                System.out.println("thread scheduler run time :" + Hello.printTime());
                try {
                    template.convertAndSend("/topic/simplemessagesresponse", statStr);
                } catch (MessagingException e) {
                    System.err.println("!!!!!! websocket timer error :>" + e.toString());
                }
            }
        }, 4000);
    }

    @PreDestroy
    private void destroyServices() {
        scheduler = null; // how to destroy ?
    }

    public void configureClientInboundChannel(ChannelRegistration registration) {
    }

    public void configureClientOutboundChannel(ChannelRegistration registration) {
        registration.taskExecutor().corePoolSize(4).maxPoolSize(10);
    }

    public boolean configureMessageConverters(List<MessageConverter> arg0) {
        return true;
    }

    @Override
    public void configureWebSocketTransport(WebSocketTransportRegistration arg0) {
    }
}
I want to know two things:
I found that the scheduled task runs twice within the 4000-millisecond period. How is that happening and how can I stop it?
I run this application in Tomcat. As you can see, the method destroyServices() needs to destroy the scheduler. The problem is that even after Tomcat is restarted, the previously running thread keeps running, so when Tomcat goes down that thread should also be terminated. How can I destroy it when Tomcat shuts down or in case of a system crash?
The following code snippet is from the documentation of @EnableScheduling:
@Configuration
@EnableScheduling
public class AppConfig implements SchedulingConfigurer {

    @Override
    public void configureTasks(ScheduledTaskRegistrar taskRegistrar) {
        taskRegistrar.setScheduler(taskExecutor());
    }

    @Bean(destroyMethod = "shutdown")
    public Executor taskExecutor() {
        return Executors.newScheduledThreadPool(100);
    }
}
I think you should get the bean named taskExecutor (in this case) and call its shutdown method (the exact method depends on your configuration).
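Applied to the WebSocketConfig above, a minimal sketch would be to make the underlying ScheduledExecutorService a Spring bean so the container shuts it down when the context closes, instead of setting the field to null. WebSocketConfig would then inject this TaskScheduler rather than creating its own ConcurrentTaskScheduler; the class name, bean names, and pool size here are illustrative.

@Configuration
@EnableScheduling
public class SchedulerConfig {

    // Spring calls shutdown() on this executor when the context is closed
    // (e.g. on Tomcat shutdown), which stops the periodic broadcast task.
    @Bean(destroyMethod = "shutdown")
    public ScheduledExecutorService broadcastExecutor() {
        return Executors.newSingleThreadScheduledExecutor();
    }

    @Bean
    public TaskScheduler broadcastScheduler(ScheduledExecutorService broadcastExecutor) {
        return new ConcurrentTaskScheduler(broadcastExecutor);
    }
}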
