Proper way to pool client channels in netty? - java

I'm getting a java.nio.channels.NotYetConnectedException in the following code because I'm trying to write to a channel that is not yet open.
Essentially what I have is a channel pool in which I grab a free channel to write to if one is available, and create a new channel if one is not. My problem is that when I create a new channel, it is not yet ready for writing when connect() returns, and I don't want to wait for the connection to complete before returning because I don't want to block the thread. What's the best way to do this? Also, is my logic for retrieving/returning channels valid? See the code below.
I have a simple connection pool like the following:
private static class ChannelPool {
private final ClientBootstrap cb;
private Set<Channel> activeChannels = new HashSet<Channel>();
private Deque<Channel> freeChannels = new ArrayDeque<Channel>();
public ChannelPool() {
ChannelFactory clientFactory =
new NioClientSocketChannelFactory(
Executors.newCachedThreadPool(),
Executors.newCachedThreadPool());
cb = new ClientBootstrap(clientFactory);
cb.setPipelineFactory(new ChannelPipelineFactory() {
public ChannelPipeline getPipeline() {
return Channels.pipeline(
new HttpRequestEncoder(),
new HttpResponseDecoder(),
new ResponseHandler());
}
});
}
private Channel newChannel() {
ChannelFuture cf;
synchronized (cb) {
cf = cb.connect(new InetSocketAddress("localhost", 18080));
}
final Channel ret = cf.getChannel();
ret.getCloseFuture().addListener(new ChannelFutureListener() {
@Override
public void operationComplete(ChannelFuture arg0) throws Exception {
System.out.println("channel closed?");
synchronized (activeChannels) {
activeChannels.remove(ret);
}
}
});
synchronized (activeChannels) {
activeChannels.add(ret);
}
System.out.println("returning new channel");
return ret;
}
public Channel getFreeChannel() {
synchronized (freeChannels) {
while (!freeChannels.isEmpty()) {
Channel ch = freeChannels.pollFirst();
if (ch.isOpen()) {
return ch;
}
}
}
return newChannel();
}
public void returnChannel(Channel ch) {
synchronized (freeChannels) {
freeChannels.addLast(ch);
}
}
}
I'm trying to use this inside a handler as follows:
private static class RequestHandler extends SimpleChannelHandler {
@Override
public void messageReceived(ChannelHandlerContext ctx, final MessageEvent e) {
final HttpRequest request = (HttpRequest) e.getMessage();
Channel proxyChannel = pool.getFreeChannel();
proxyToClient.put(proxyChannel, e.getChannel());
proxyChannel.write(request);
}
}

Instead of adding the new channel to activeChannels immediately after bootstrap.connect(..), you have to add a listener to the ChannelFuture returned by bootstrap.connect(..) and add the channel to activeChannels inside that listener. That way, getFreeChannel() will never return a channel that is not connected yet.
Because activeChannels is likely to be empty even after you have called newChannel() (newChannel() returns before the connection is established), you have to decide what to do in that case. If I were you, I would change the return type of getFreeChannel() from Channel to ChannelFuture so that the caller gets notified when the free channel is ready.
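A minimal sketch of that change, written as methods inside the ChannelPool class above (illustrative only; the names newChannelFuture and getFreeChannelFuture are mine, not part of the original code):
private ChannelFuture newChannelFuture() {
    ChannelFuture cf;
    synchronized (cb) {
        cf = cb.connect(new InetSocketAddress("localhost", 18080));
    }
    cf.addListener(new ChannelFutureListener() {
        @Override
        public void operationComplete(ChannelFuture future) throws Exception {
            if (future.isSuccess()) {
                synchronized (activeChannels) {
                    // track the channel only once it is actually connected;
                    // the close-future listener from newChannel() would be registered here too
                    activeChannels.add(future.getChannel());
                }
            }
        }
    });
    return cf;
}
public ChannelFuture getFreeChannelFuture() {
    synchronized (freeChannels) {
        while (!freeChannels.isEmpty()) {
            Channel ch = freeChannels.pollFirst();
            if (ch.isOpen()) {
                return Channels.succeededFuture(ch); // already connected, completes immediately
            }
        }
    }
    return newChannelFuture();
}
The RequestHandler would then register a ChannelFutureListener on the returned future and issue its write from operationComplete() once future.isSuccess() is true, instead of writing right away.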

Related

using netty library, server unable to handle the reconnection scenario of client

The connection is established the first time, but if the client closes the connection and connects again, or is restarted, the connection is not re-established; the server only ever accepts the connection once.
Can someone help me improve this so that it can handle any number of clients simultaneously?
bossGroup = new NioEventLoopGroup(1);
workerGroup = new NioEventLoopGroup();
try {
ServerBootstrap b = new ServerBootstrap();
b.group(bossGroup, workerGroup).channel(NioServerSocketChannel.class).option(ChannelOption.SO_BACKLOG, 100)
.handler(new LoggingHandler(LogLevel.INFO)).childHandler(new ChannelInitializer<SocketChannel>() {
@Override
public void initChannel(SocketChannel ch) throws Exception {
ChannelPipeline p = ch.pipeline();
p.addLast(new DelimiterBasedFrameDecoder(20000, Delimiters.lineDelimiter()));
// p.addLast(new StringDecoder());
// p.addLast(new StringEncoder());
p.addLast(serverHandler);
}
});
// Start the server.
LOGGER.key("Simulator is opening listen port").low().end();
ChannelFuture f = b.bind(config.getPort()).sync();
LOGGER.key("Simulator started listening at port: " + config.getPort()).low().end();
// Wait until the server socket is closed.
f.channel().closeFuture().sync();
} finally {
// Shut down all event loops to terminate all threads.
LOGGER.key("Shtting down all the thread if anyone is still open.").low().end();
bossGroup.shutdownGracefully();
workerGroup.shutdownGracefully();
}
Server Handler code is below:
public class SimulatorServerHandler extends SimpleChannelInboundHandler<String> {
private AtomicReference<ChannelHandlerContext> ctxRef = new AtomicReference<ChannelHandlerContext>();
private final ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1);
private AtomicInteger seqNum = new AtomicInteger(1);
private final Configuration configure;
private ScheduledFuture<?> hbTimerWorker;
private final int stx = 0x02;
private final int etx = 0x03;
private final ILogger LOGGER;
public int enablePublishFunction = 0;
public SimulatorServerHandler(Configuration config) {
this.configure = config;
//LOGGER = LogFactory.INSTANCE.createLogger();
LOGGER = new LogFactory().createLogger("SIM SERVER");
}
@Override
public void channelActive(ChannelHandlerContext ctx) throws Exception {
ctxRef.set(ctx);
enablePublishFunction =1;
// System.out.println("Connected!");
LOGGER.low().key("Gateway connected to the Simulator ").end();
startHBTimer();
}
@Override
public void channelInactive(ChannelHandlerContext ctx) throws Exception {
ctx.fireChannelInactive();
hbTimerWorker.cancel(false);
enablePublishFunction =0;
LOGGER.low().key("Gateway disconnected from the Simulator ").end();
}
@Override
public void channelRead0(ChannelHandlerContext ctx, String request) {
// Generate and write a response.
String response;
boolean close = false;
/* if (request.isEmpty()) {
response = "Please type something.\r\n";
} else if ("bye".equals(request.toLowerCase())) {
response = "Have a good day!\r\n";
close = true;
} else {
response = "Did you say '" + request + "'?\r\n";
}
// We do not need to write a ChannelBuffer here.
// We know the encoder inserted at TelnetPipelineFactory will do the conversion.
ChannelFuture future = ctx.write(response);
// Close the connection after sending 'Have a good day!'
// if the client has sent 'bye'.
if (close) {
future.addListener(ChannelFutureListener.CLOSE);
}
*/
System.out.println(request);
}
@Override
public void channelReadComplete(ChannelHandlerContext ctx) {
ctx.flush();
}
@Override
public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) {
LOGGER.key("Unknown exception while network communication :"+ cause.getStackTrace()).high().end();
cause.printStackTrace();
ctx.close();
}
Maybe it's because you always use the very same server handler instance in your pipeline for all connections (not creating it with new ServerHandler())? Side effects in your implementation can prevent the handler from being safely reused.
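A minimal sketch of that fix, reusing the bootstrap from the question (illustrative): create a fresh handler instance for every accepted connection inside initChannel().
b.childHandler(new ChannelInitializer<SocketChannel>() {
    @Override
    public void initChannel(SocketChannel ch) throws Exception {
        ChannelPipeline p = ch.pipeline();
        p.addLast(new DelimiterBasedFrameDecoder(20000, Delimiters.lineDelimiter()));
        p.addLast(new SimulatorServerHandler(config)); // new handler instance per channel
    }
});
Alternatively, a handler that is truly stateless can be annotated with @ChannelHandler.Sharable and shared across pipelines, but this handler keeps per-connection state (ctxRef, hbTimerWorker, enablePublishFunction), so a new instance per channel is the safer choice.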

How to get server response with netty client

I want to write a Netty-based client. It should have a method public String send(String msg); which returns the response from the server, or some future (it doesn't matter which). It should also be multithreaded. Like this:
public class Client {
public static void main(String[] args) throws InterruptedException {
Client client = new Client();
}
private Channel channel;
public Client() throws InterruptedException {
EventLoopGroup loopGroup = new NioEventLoopGroup();
Bootstrap b = new Bootstrap();
b.group(loopGroup).channel(NioSocketChannel.class).handler(new ChannelInitializer<SocketChannel>() {
@Override
protected void initChannel(SocketChannel ch) throws Exception {
ch.pipeline().addLast(new StringDecoder()).
addLast(new StringEncoder()).
addLast(new ClientHandler());
}
});
channel = b.connect("localhost", 9091).sync().channel();
}
public String sendMessage(String msg) {
channel.writeAndFlush(msg);
return ??????????;
}
}
And I don't get how I can retrieve the response from the server after I invoke writeAndFlush(). What should I do?
I'm using Netty 4.0.18.Final.
Returning a Future<String> from the method is simple; we are going to implement the following method signature:
public Future<String> sendMessage(String msg) {
This is relatively easy to do once you are familiar with asynchronous programming structures. To solve the design problem, we take the following steps:
When a message is written, add a Promise<String> to an ArrayBlockingQueue<Promise<String>>
This serves as a list of the messages that have recently been sent, and lets us set the result of the corresponding Future<String>.
When a message arrives back in the handler, resolve it against the head of the queue
This lets us pick the correct future to complete.
Update the state of the Promise<String>
We call promise.setSuccess() to set the final state on the promise; this propagates back to the Future object.
Example code
public class ClientHandler extends SimpleChannelInboundHandler<String> {
private ChannelHandlerContext ctx;
private BlockingQueue<Promise<String>> messageList = new ArrayBlockingQueue<>(16);
@Override
public void channelActive(ChannelHandlerContext ctx) {
super.channelActive(ctx);
this.ctx = ctx;
}
@Override
public void channelInactive(ChannelHandlerContext ctx) {
super.channelInactive(ctx);
synchronized(this){
Promise<String> prom;
while((prom = messageList.poll()) != null)
prom.setFailure(new IOException("Connection lost"));
messageList = null;
}
}
public Future<String> sendMessage(String message) {
if(ctx == null)
throw new IllegalStateException();
return sendMessage(message, ctx.executor().newPromise());
}
public Future<String> sendMessage(String message, Promise<String> prom) {
synchronized(this){
if(messageList == null) {
// Connection closed
prom.setFailure(new IllegalStateException());
} else if(messageList.offer(prom)) {
// Connection open and message accepted
ctx.writeAndFlush(message);
} else {
// Connection open and message rejected
prom.setFailure(new BufferOverflowException());
}
return prom;
}
}
@Override
protected void channelRead0(ChannelHandlerContext ctx, String msg) { // channelRead0 is the Netty 4.0.x callback name
synchronized(this){
if(messageList != null) {
messageList.poll().setSuccess(msg);
}
}
}
}
Documentation breakdown
private ChannelHandlerContext ctx;
Used to store our reference to the ChannelHandlerContext; we need it so we can create promises.
private BlockingQueue<Promise<String>> messageList = new ArrayBlockingQueue<>(16);
We keep the past messages in this list so we can change the result of the future
public void channelActive(ChannelHandlerContext ctx)
Called by netty when the connection becomes active. Init our variables here.
public void channelInactive(ChannelHandlerContext ctx)
Called by netty when the connection becomes inactive, either due to error or normal connection close.
protected void channelRead0(ChannelHandlerContext ctx, String msg)
Called by Netty when a new message arrives; here we pick the head of the queue and call setSuccess() on it.
A word of warning
When using futures, there is one thing to look out for: do not call get() from one of the Netty threads if the future isn't done yet. Failing to follow this simple rule will result in either a deadlock or a BlockingOperationException.
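For example, a caller could consume the returned future without blocking a Netty thread like this (a usage sketch; clientHandler stands for the ClientHandler instance that was added to the pipeline):
// io.netty.util.concurrent.Future, not java.util.concurrent.Future
Future<String> f = clientHandler.sendMessage("hello");
// Non-blocking: safe from any thread, including event-loop threads.
f.addListener(new GenericFutureListener<Future<String>>() {
    @Override
    public void operationComplete(Future<String> future) {
        if (future.isSuccess()) {
            System.out.println("Server replied: " + future.getNow());
        } else {
            future.cause().printStackTrace();
        }
    }
});
// Blocking: only acceptable from a thread that is NOT part of the event loop.
// String reply = f.get(5, TimeUnit.SECONDS);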
You can find this sample in the Netty project.
We can save the result in the last handler's custom fields. In the following code, handler.getFactorial() is what we want.
refer to http://www.lookatsrc.com/source/io/netty/example/factorial/FactorialClient.java?a=io.netty:netty-all
FactorialClient.java
public final class FactorialClient {
static final boolean SSL = System.getProperty("ssl") != null;
static final String HOST = System.getProperty("host", "127.0.0.1");
static final int PORT = Integer.parseInt(System.getProperty("port", "8322"));
static final int COUNT = Integer.parseInt(System.getProperty("count", "1000"));
public static void main(String[] args) throws Exception {
// Configure SSL.
final SslContext sslCtx;
if (SSL) {
sslCtx = SslContextBuilder.forClient()
.trustManager(InsecureTrustManagerFactory.INSTANCE).build();
} else {
sslCtx = null;
}
EventLoopGroup group = new NioEventLoopGroup();
try {
Bootstrap b = new Bootstrap();
b.group(group)
.channel(NioSocketChannel.class)
.handler(new FactorialClientInitializer(sslCtx));
// Make a new connection.
ChannelFuture f = b.connect(HOST, PORT).sync();
// Get the handler instance to retrieve the answer.
FactorialClientHandler handler =
(FactorialClientHandler) f.channel().pipeline().last();
// Print out the answer.
System.err.format("Factorial of %,d is: %,d", COUNT, handler.getFactorial());
} finally {
group.shutdownGracefully();
}
}
}
public class FactorialClientHandler extends SimpleChannelInboundHandler<BigInteger> {
private ChannelHandlerContext ctx;
private int receivedMessages;
private int next = 1;
final BlockingQueue<BigInteger> answer = new LinkedBlockingQueue<BigInteger>();
public BigInteger getFactorial() {
boolean interrupted = false;
try {
for (;;) {
try {
return answer.take();
} catch (InterruptedException ignore) {
interrupted = true;
}
}
} finally {
if (interrupted) {
Thread.currentThread().interrupt();
}
}
}
@Override
public void channelActive(ChannelHandlerContext ctx) {
this.ctx = ctx;
sendNumbers();
}
@Override
public void channelRead0(ChannelHandlerContext ctx, final BigInteger msg) {
receivedMessages ++;
if (receivedMessages == FactorialClient.COUNT) {
// Offer the answer after closing the connection.
ctx.channel().close().addListener(new ChannelFutureListener() {
@Override
public void operationComplete(ChannelFuture future) {
boolean offered = answer.offer(msg);
assert offered;
}
});
}
}
@Override
public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) {
cause.printStackTrace();
ctx.close();
}
private void sendNumbers() {
// Do not send more than 4096 numbers.
ChannelFuture future = null;
for (int i = 0; i < 4096 && next <= FactorialClient.COUNT; i++) {
future = ctx.write(Integer.valueOf(next));
next++;
}
if (next <= FactorialClient.COUNT) {
assert future != null;
future.addListener(numberSender);
}
ctx.flush();
}
private final ChannelFutureListener numberSender = new ChannelFutureListener() {
@Override
public void operationComplete(ChannelFuture future) throws Exception {
if (future.isSuccess()) {
sendNumbers();
} else {
future.cause().printStackTrace();
future.channel().close();
}
}
};
}
Calling channel.writeAndFlush(msg); already returns a ChannelFuture. To handle the result of this method call, you could add a listener to the future like this:
future.addListener(new ChannelFutureListener() {
public void operationComplete(ChannelFuture future) {
// Perform post-write operation
// ...
}
});
(this is taken from the Netty documentation; see the Netty docs)

Netty client sometimes doesn't receive all expected messages

I have a fairly simple test Netty server/client project . I am testing some aspects of the stability of the communication by flooding the server with messages and counting the messages and bytes that I get back to make sure that everything matches.
When I run the flood from the client, the client keeps track of the number of messages it sends and how many it gets back and then when the number equal to each other it prints out some stats.
On certain occasions when running locally (I'm guessing because of congestion?) the client never ends up printing out the final message. I haven't run into this issue when the two components are on remote machines. Any suggestions would be appreciated:
The Encoder is just a simple OneToOneEncoder that encodes an Envelope type to a ChannelBuffer and the Decoder is a simple ReplayDecoder that does the opposite.
I tried adding a channelInterestChanged method to my client handler to see if the channel's interest was getting changed to not read, but that did not seem to be the case.
The relevant code is below:
Thanks!
SERVER
public class Server {
// configuration --------------------------------------------------------------------------------------------------
private final int port;
private ServerChannelFactory serverFactory;
private DeviceIdAwareChannelGroup channelGroup;
// constructors ---------------------------------------------------------------------------------------------------
public Server(int port) {
this.port = port;
}
// public methods -------------------------------------------------------------------------------------------------
public boolean start() {
ExecutorService bossThreadPool = Executors.newCachedThreadPool();
ExecutorService childThreadPool = Executors.newCachedThreadPool();
this.serverFactory = new NioServerSocketChannelFactory(bossThreadPool, childThreadPool);
this.channelGroup = new DeviceIdAwareChannelGroup(this + "-channelGroup");
ChannelPipelineFactory pipelineFactory = new ChannelPipelineFactory() {
@Override
public ChannelPipeline getPipeline() throws Exception {
ChannelPipeline pipeline = Channels.pipeline();
pipeline.addLast("encoder", Encoder.getInstance());
pipeline.addLast("decoder", new Decoder());
pipeline.addLast("handler", new ServerHandler());
return pipeline;
}
};
ServerBootstrap bootstrap = new ServerBootstrap(this.serverFactory);
bootstrap.setOption("reuseAddress", true);
bootstrap.setOption("child.tcpNoDelay", true);
bootstrap.setOption("child.keepAlive", true);
bootstrap.setPipelineFactory(pipelineFactory);
Channel channel = bootstrap.bind(new InetSocketAddress(this.port));
if (!channel.isBound()) {
this.stop();
return false;
}
this.channelGroup.add(channel);
return true;
}
public void stop() {
if (this.channelGroup != null) {
ChannelGroupFuture channelGroupCloseFuture = this.channelGroup.close();
System.out.println("waiting for ChannelGroup shutdown...");
channelGroupCloseFuture.awaitUninterruptibly();
}
if (this.serverFactory != null) {
this.serverFactory.releaseExternalResources();
}
}
// main -----------------------------------------------------------------------------------------------------------
public static void main(String[] args) {
int port;
if (args.length != 3) {
System.out.println("No arguments found using default values");
port = 9999;
} else {
port = Integer.parseInt(args[1]);
}
final Server server = new Server( port);
if (!server.start()) {
System.exit(-1);
}
System.out.println("Server started on port 9999 ... ");
Runtime.getRuntime().addShutdownHook(new Thread() {
@Override
public void run() {
server.stop();
}
});
}
}
SERVER HANDLER
public class ServerHandler extends SimpleChannelUpstreamHandler {
// internal vars --------------------------------------------------------------------------------------------------
private AtomicInteger numMessagesReceived=new AtomicInteger(0);
// constructors ---------------------------------------------------------------------------------------------------
public ServerHandler() {
}
// SimpleChannelUpstreamHandler -----------------------------------------------------------------------------------
@Override
public void channelConnected(ChannelHandlerContext ctx, ChannelStateEvent e) throws Exception {
Channel c = e.getChannel();
System.out.println("ChannelConnected: channel id: " + c.getId() + ", remote host: " + c.getRemoteAddress() + ", isChannelConnected(): " + c.isConnected());
}
@Override
public void exceptionCaught(ChannelHandlerContext ctx, ExceptionEvent e) throws Exception {
System.out.println("*** EXCEPTION CAUGHT!!! ***");
e.getChannel().close();
}
@Override
public void channelDisconnected(ChannelHandlerContext ctx, ChannelStateEvent e) throws Exception {
super.channelDisconnected(ctx, e);
System.out.println("*** CHANNEL DISCONNECTED ***");
}
@Override
public void messageReceived(ChannelHandlerContext ctx, MessageEvent e) throws Exception {
if(numMessagesReceived.incrementAndGet()%1000==0 ){
System.out.println("["+numMessagesReceived+"-TH MSG]: Received message: " + e.getMessage());
}
if (e.getMessage() instanceof Envelope) {
// echo it...
if (e.getChannel().isWritable()) {
e.getChannel().write(e.getMessage());
}
} else {
super.messageReceived(ctx, e);
}
}
}
CLIENT
public class Client implements ClientHandlerListener {
// configuration --------------------------------------------------------------------------------------------------
private final String host;
private final int port;
private final int messages;
// internal vars --------------------------------------------------------------------------------------------------
private ChannelFactory clientFactory;
private ChannelGroup channelGroup;
private ClientHandler handler;
private final AtomicInteger received;
private long startTime;
private ExecutorService cachedThreadPool = Executors.newCachedThreadPool();
// constructors ---------------------------------------------------------------------------------------------------
public Client(String host, int port, int messages) {
this.host = host;
this.port = port;
this.messages = messages;
this.received = new AtomicInteger(0);
}
// ClientHandlerListener ------------------------------------------------------------------------------------------
@Override
public void messageReceived(Envelope message) {
if (this.received.incrementAndGet() == this.messages) {
long stopTime = System.currentTimeMillis();
float timeInSeconds = (stopTime - this.startTime) / 1000f;
System.err.println("Sent and received " + this.messages + " in " + timeInSeconds + "s");
System.err.println("That's " + (this.messages / timeInSeconds) + " echoes per second!");
}
}
// public methods -------------------------------------------------------------------------------------------------
public boolean start() {
// For production scenarios, use limited sized thread pools
this.clientFactory = new NioClientSocketChannelFactory(cachedThreadPool, cachedThreadPool);
this.channelGroup = new DefaultChannelGroup(this + "-channelGroup");
this.handler = new ClientHandler(this, this.channelGroup);
ChannelPipelineFactory pipelineFactory = new ChannelPipelineFactory() {
@Override
public ChannelPipeline getPipeline() throws Exception {
ChannelPipeline pipeline = Channels.pipeline();
pipeline.addLast("byteCounter", new ByteCounter("clientByteCounter"));
pipeline.addLast("encoder", Encoder.getInstance());
pipeline.addLast("decoder", new Decoder());
pipeline.addLast("handler", handler);
return pipeline;
}
};
ClientBootstrap bootstrap = new ClientBootstrap(this.clientFactory);
bootstrap.setOption("reuseAddress", true);
bootstrap.setOption("tcpNoDelay", true);
bootstrap.setOption("keepAlive", true);
bootstrap.setPipelineFactory(pipelineFactory);
boolean connected = bootstrap.connect(new InetSocketAddress(host, port)).awaitUninterruptibly().isSuccess();
System.out.println("isConnected: " + connected);
if (!connected) {
this.stop();
}
return connected;
}
public void stop() {
if (this.channelGroup != null) {
this.channelGroup.close();
}
if (this.clientFactory != null) {
this.clientFactory.releaseExternalResources();
}
}
public ChannelFuture sendMessage(Envelope env) {
Channel ch = this.channelGroup.iterator().next();
ChannelFuture cf = ch.write(env);
return cf;
}
private void flood() {
if ((this.channelGroup == null) || (this.clientFactory == null)) {
return;
}
System.out.println("sending " + this.messages + " messages");
this.startTime = System.currentTimeMillis();
for (int i = 0; i < this.messages; i++) {
this.handler.sendMessage(new Envelope(Version.VERSION1, Type.REQUEST, 1, new byte[1]));
}
}
// main -----------------------------------------------------------------------------------------------------------
public static void main(String[] args) throws InterruptedException {
final Client client = new Client("localhost", 9999, 10000);
if (!client.start()) {
System.exit(-1);
return;
}
while (client.channelGroup.size() == 0) {
Thread.sleep(200);
}
System.out.println("Client started...");
client.flood();
Runtime.getRuntime().addShutdownHook(new Thread() {
@Override
public void run() {
System.out.println("shutting down client");
client.stop();
}
});
}
}
CLIENT HANDLER
public class ClientHandler extends SimpleChannelUpstreamHandler {
// internal vars --------------------------------------------------------------------------------------------------
private final ClientHandlerListener listener;
private final ChannelGroup channelGroup;
private Channel channel;
// constructors ---------------------------------------------------------------------------------------------------
public ClientHandler(ClientHandlerListener listener, ChannelGroup channelGroup) {
this.listener = listener;
this.channelGroup = channelGroup;
}
// SimpleChannelUpstreamHandler -----------------------------------------------------------------------------------
@Override
public void messageReceived(ChannelHandlerContext ctx, MessageEvent e) throws Exception {
if (e.getMessage() instanceof Envelope) {
Envelope env = (Envelope) e.getMessage();
this.listener.messageReceived(env);
} else {
System.out.println("NOT ENVELOPE!!");
super.messageReceived(ctx, e);
}
}
@Override
public void exceptionCaught(ChannelHandlerContext ctx, ExceptionEvent e) throws Exception {
System.out.println("**** CAUGHT EXCEPTION CLOSING CHANNEL ***");
e.getCause().printStackTrace();
e.getChannel().close();
}
@Override
public void channelConnected(ChannelHandlerContext ctx, ChannelStateEvent e) throws Exception {
this.channel = e.getChannel();
System.out.println("Server connected, channel id: " + this.channel.getId());
this.channelGroup.add(e.getChannel());
}
// public methods -------------------------------------------------------------------------------------------------
public void sendMessage(Envelope envelope) {
if (this.channel != null) {
this.channel.write(envelope);
}
}
}
CLIENT HANDLER LISTENER INTERFACE
public interface ClientHandlerListener {
void messageReceived(Envelope message);
}
Without knowing how big the envelope is on the network I'm going to guess that your problem is that your client writes 10,000 messages without checking if the channel is writable.
Netty 3.x processes network events and writes in a particular fashion. It's possible that your client is writing so much data so fast that Netty isn't getting a chance to process receive events. On the server side this would result in the channel becoming non writable and your handler dropping the reply.
There are a few reasons why you see the problem on localhost but it's probably because the write bandwidth is much higher than your network bandwidth. The client doesn't check if the channel is writable, so over a network your messages are buffered by Netty until the network can catch up (if you wrote significantly more than 10,000 messages you might see an OutOfMemoryError). This acts as a natural break because Netty will suspend writing until the network is ready, allowing it to process incoming data and preventing the server from seeing a channel that's not writable.
The DiscardClientHandler in the discard example shows how to test whether the channel is writable and how to resume writing when it becomes writable again. Another option is to have sendMessage return the ChannelFuture associated with the write and, if the channel is not writable after the write, block until the future completes.
Also, your server handler should write the message and then check whether the channel is writable. If it isn't, you should set the channel's readable flag to false. Netty will call channelInterestChanged when the channel becomes writable again; then you can set readable back to true to resume reading messages.
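A minimal sketch of that server-side throttling (Netty 3.x, illustrative; this is not the poster's actual handler):
import org.jboss.netty.channel.*;
public class ThrottlingEchoHandler extends SimpleChannelUpstreamHandler {
    @Override
    public void messageReceived(ChannelHandlerContext ctx, MessageEvent e) {
        Channel ch = e.getChannel();
        ch.write(e.getMessage()); // echo the envelope back
        if (!ch.isWritable()) {
            ch.setReadable(false); // stop reading until the outbound buffer drains
        }
    }
    @Override
    public void channelInterestChanged(ChannelHandlerContext ctx, ChannelStateEvent e) {
        Channel ch = e.getChannel();
        if (ch.isWritable()) {
            ch.setReadable(true); // resume reading once the channel is writable again
        }
    }
}
The client side can apply the same idea in flood(): check channel.isWritable() before each write, or block on the ChannelFuture returned by sendMessage() whenever the channel reports that it is not writable.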

Handling ReadTimeoutHandler time out

I just can't figure out why my read timeout is not working. All I want to do is wait
for 10 seconds for some thread to put a message into a BlockingQueue<String>, and on timeout return some kind of response to the client.
public class NioAsynChatPipelineFactory implements ChannelPipelineFactory {
private static Timer timer = new HashedWheelTimer();
private final ChannelHandler timeoutHandler = new ReadTimeoutHandler(timer, 10);
@Override
public ChannelPipeline getPipeline() throws Exception {
ChannelPipeline pipeline = Channels.pipeline();
pipeline.addLast("decoder", new HttpRequestDecoder());
pipeline.addLast("encoder", new HttpResponseEncoder());
pipeline.addLast("handler", new NioAsynChatHandler());
pipeline.addLast("timeout", this.timeoutHandler);
return pipeline;
}
}
Now my handler looks like this.
public class NioAsynChatHandler extends SimpleChannelUpstreamHandler{
@Override
public void handleUpstream(
ChannelHandlerContext ctx, ChannelEvent e) throws Exception {
super.handleUpstream(ctx, e);
}
@Override
public void exceptionCaught(ChannelHandlerContext ctx, ExceptionEvent e)
throws Exception {
System.out.println("Exception");
// write some kind of response and close the channel
}
@Override
public void messageReceived(ChannelHandlerContext ctx, MessageEvent e) throws Exception {
Thread thread = new Thread(new ConsumerTask(e.getChannel()));
thread.start();
}
and inside ConsumerTask I'm just waiting on the BlockingQueue for the response:
public class ConsumerTask implements Runnable{
private Channel channel;
public ConsumerTask(Channel channel){
this.channel = channel;
}
@Override
public void run() {
try{
while(true){
String message = queue.take();
}
} catch(InterruptedException ex){
Thread.currentThread().interrupt();
} finally{
//write something to channel and close it
}
}
My problem is that I don't see any exception occur on timeout.
What am I doing wrong?
Update:
public static final BlockingQueue<String> blockingQueue = new LinkedBlockingQueue<String>();
Actually my question is more generic: how do I close the channel on timeout while it is waiting for something in an external thread?
Update 2:
Another question: given that I'm starting an external thread from the channel handler, would it be better to use an OrderedMemoryAwareThreadPoolExecutor in the pipeline? Would it increase performance?
It's basically because you put the ReadTimeoutHandler in the wrong position. Please put it in the first position of the pipeline (i.e. before all handlers).
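A minimal sketch of the reordered factory, based on the pipeline factory above (illustrative):
@Override
public ChannelPipeline getPipeline() throws Exception {
    ChannelPipeline pipeline = Channels.pipeline();
    // ReadTimeoutHandler goes first so it observes inbound traffic before the other handlers.
    // A new handler is created per pipeline here; the shared field from the question works too
    // as long as the handler instance is sharable.
    pipeline.addLast("timeout", new ReadTimeoutHandler(timer, 10));
    pipeline.addLast("decoder", new HttpRequestDecoder());
    pipeline.addLast("encoder", new HttpResponseEncoder());
    pipeline.addLast("handler", new NioAsynChatHandler());
    return pipeline;
}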

Basic Netty echo server - string encoder error?

This is my first time using Netty and I'm having trouble making a simple echo server! I looked at the docs and they say to use the string encoder and decoder, which apparently I'm not using properly. For the frame decoder, I'd like to use a one-byte length header, but that doesn't seem to be working either because of the string issue. I assume my implementation of the pipeline factory is messed up.
Bonus Question:
Because I'm stupid and ambitious, I tried implementing a timeout/heartbeat handler. That didn't work either.
Here are the console output and java code:
Console:
>>telnet localhost 6969
Trying 127.0.0.1...
Connected to localhost.
Escape character is '^]'.
>>3
Connection closed by foreign host.
Java Console:
Starting server on 6969
channelConnected
channelDisconnected
java.lang.IllegalArgumentException: unsupported message type: class java.lang.String
at org.jboss.netty.channel.socket.nio.SocketSendBufferPool.acquire(SocketSendBufferPool.java:51)
at org.jboss.netty.channel.socket.nio.NioWorker.write0(NioWorker.java:455)
...
Server.java
public class Server {
public static void main(String[] args) throws Exception {
ChannelFactory factory =
new NioServerSocketChannelFactory(
Executors.newCachedThreadPool(),
Executors.newCachedThreadPool());
ServerBootstrap bootstrap = new ServerBootstrap(factory);
Timer timer = new HashedWheelTimer();
bootstrap.setPipelineFactory(new MyPipelineFactory(timer) {
public ChannelPipeline getPipeline() {
return Channels.pipeline(new ServerHandler());
}
});
bootstrap.setOption("child.tcpNoDelay", true);
bootstrap.setOption("child.keepAlive", true);
bootstrap.bind(new InetSocketAddress(6969));
System.out.println("Starting server on 6969");
}
}
ServerHandler.java
public class ServerHandler extends SimpleChannelHandler {
@Override
public void channelConnected(ChannelHandlerContext ctx, ChannelStateEvent e){
Channel ch = e.getChannel();
System.out.println("channelConnected");
}
@Override
public void channelDisconnected(ChannelHandlerContext ctx, ChannelStateEvent e){
Channel ch = e.getChannel();
System.out.println("channelDisconnected");
}
@Override
public void messageReceived(ChannelHandlerContext ctx, MessageEvent e) {
String msg = (String) e.getMessage();
e.getChannel().write("Did you say '" + msg + "'?\n");
}
@Override
public void exceptionCaught(ChannelHandlerContext ctx, ExceptionEvent e) {
e.getCause().printStackTrace();
Channel ch = e.getChannel();
ch.close();
}
}
MyPipelineFactory.java
public class MyPipelineFactory implements ChannelPipelineFactory {
private final Timer timer;
private static ChannelHandler idleStateHandler;
public MyPipelineFactory(Timer t) {
this.timer = t;
//this.idleStateHandler = new IdleStateHandler(timer, 5, 20, 0); // timer must be shared
}
public ChannelPipeline getPipeline() {
// create default pipeline from static method
ChannelPipeline pipeline = Channels.pipeline();
// Decoders
int maxFrameLength = 1024;
pipeline.addLast("framer", new DelimiterBasedFrameDecoder(maxFrameLength, Delimiters.lineDelimiter()));
//pipeline.addLast("frameDecoder", new LengthFieldBasedFrameDecoder(maxFrameLength,0,1)); // get header from message
pipeline.addLast("stringDecoder", new StringDecoder(CharsetUtil.UTF_8));
// Encoders
pipeline.addLast("stringEncoder", new StringEncoder(CharsetUtil.UTF_8));
// Idle state handling- heartbeat
//pipeline.addLast("idleStateHandler", idleStateHandler);
return pipeline;
}
}
Bonus, because I'm stupid and want to get in over my head...
HeartbeatHandler.java
public class HeartbeatHandler extends IdleStateAwareChannelHandler {
@Override
public void channelIdle(ChannelHandlerContext ctx, IdleStateEvent e) {
if (e.getState() == IdleState.READER_IDLE) {
System.out.println("Reader idle, closing channel");
e.getChannel().close();
}
else if (e.getState() == IdleState.WRITER_IDLE) {
System.out.println("Writer idle, sending heartbeat");
e.getChannel().write("heartbeat"); //
}
}
}
It's because you messed up the ChannelPipeline.
You use:
bootstrap.setPipelineFactory(new MyPipelineFactory(timer) {
public ChannelPipeline getPipeline() {
return Channels.pipeline(new ServerHandler());
}
});
What you need to do is modify the MyPipelineFactory class and add your ServerHandler there (see the sketch below). Then just set it like:
bootstrap.setPipelineFactory(new MyPipelineFactory(timer));
Then everything should work. Even your timeout stuff ;)
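For example, MyPipelineFactory.getPipeline() could be extended like this (a sketch; only the last addLast line is new compared to the factory in the question):
public ChannelPipeline getPipeline() {
    ChannelPipeline pipeline = Channels.pipeline();
    int maxFrameLength = 1024;
    pipeline.addLast("framer", new DelimiterBasedFrameDecoder(maxFrameLength, Delimiters.lineDelimiter()));
    pipeline.addLast("stringDecoder", new StringDecoder(CharsetUtil.UTF_8));
    pipeline.addLast("stringEncoder", new StringEncoder(CharsetUtil.UTF_8));
    // The business handler now sits behind the codecs, so it receives decoded Strings
    // and its String replies are encoded by the StringEncoder before hitting the socket.
    pipeline.addLast("handler", new ServerHandler());
    return pipeline;
}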
