Migrating sendUpstream in Netty 4 - java

I'm migrating from Netty 3 to Netty 4. I have a pipeline handler that acts as a classic filter: it intercepts and handles non-compliant messages, and shovels compliant ones upstream.
Based on the documentation (http://netty.io/wiki/new-and-noteworthy.html), I expected to use ctx.fireInboundBufferUpdated() in lieu of ctx.sendUpstream() to relay inbound messages. However, I've found that this doesn't work, while ChannelHandlerUtil.addToNextInboundBuffer() does. I'd love some guidance on:
my confusion over the current docs' assertion that ctx.sendUpstream maps to ctx.fireInboundBufferUpdated, and
what the best practice is in this case, if different from what I've done below.
The code:
//The pipeline
public class ServerInitializer extends ChannelInitializer<SocketChannel> {
@Override
public void initChannel(SocketChannel ch) throws Exception {
ChannelPipeline p = ch.pipeline();
p.addLast("decoder", new HttpRequestDecoder());
p.addLast("encoder", new HttpResponseEncoder());
p.addLast("inbound", InboundHttpRequestFilter.INSTANCE);
p.addLast("handler", handlerClass.newInstance());
}
}
//The filter
public class InboundHttpRequestFilter extends
ChannelInboundMessageHandlerAdapter<Object> {
@Override
public void messageReceived(ChannelHandlerContext ctx, Object msg)
throws Exception {
// ... discard/handle as necessary ...
//ctx.fireInboundBufferUpdated(); - doesn't propagate upstream
ChannelHandlerUtil.addToNextInboundBuffer(ctx, msg); // sends upstream
}
}

Try this:
ctx.nextInboundMessageBuffer().add(msg)
Javadoc, from interface ChannelHandlerContext:
MessageBuf<Object> nextInboundMessageBuffer()
Returns the MessageBuf of the next ChannelInboundMessageHandler in the pipeline.
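Applied to the filter from the question, a minimal sketch would look like the following. It uses the buffer-based handler API of the Netty 4.0 betas shown in this answer (this API changed in later 4.0 releases, so adjust for your exact version):
public class InboundHttpRequestFilter extends ChannelInboundMessageHandlerAdapter<Object> {
@Override
public void messageReceived(ChannelHandlerContext ctx, Object msg) throws Exception {
// ... discard/handle non-compliant messages as necessary ...
// Hand the compliant message to the next inbound handler's buffer,
// then notify that handler that its buffer has changed.
ctx.nextInboundMessageBuffer().add(msg);
ctx.fireInboundBufferUpdated();
}
}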
Netty 4 Multiple Handler Example:
MultiHandlerServer.java
import io.netty.bootstrap.ServerBootstrap;
import io.netty.channel.ChannelFuture;
import io.netty.channel.ChannelInitializer;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.SocketChannel;
import io.netty.channel.socket.nio.NioServerSocketChannel;
import io.netty.handler.codec.LineBasedFrameDecoder;
import io.netty.handler.codec.string.StringDecoder;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.nio.charset.Charset;
public class MultiHandlerServer {
private static final Logger logger = LoggerFactory.getLogger(MultiHandlerServer.class);
final int port;
public MultiHandlerServer(final int port) {
this.port = port;
}
public void run() throws InterruptedException {
final NioEventLoopGroup bossGroup = new NioEventLoopGroup();
final NioEventLoopGroup workerGroup = new NioEventLoopGroup();
try {
final ServerBootstrap serverBootstrap = new ServerBootstrap()
.group(bossGroup, workerGroup)
.channel(NioServerSocketChannel.class)
.childHandler(new ChannelInitializer<SocketChannel>() {
@Override
protected void initChannel(SocketChannel ch) throws Exception {
ch.pipeline().addLast(
new LineBasedFrameDecoder(8192),
new StringDecoder(Charset.forName("UTF-8")),
new MultiHandler01(), new MultiHandler02());
}
});
final ChannelFuture future = serverBootstrap.bind(port).sync();
future.channel().closeFuture().sync();
} finally {
bossGroup.shutdownGracefully();
workerGroup.shutdownGracefully();
}
}
public static void main(String[] args) throws InterruptedException {
final MultiHandlerServer client = new MultiHandlerServer(8080);
client.run();
}
}
MultiHandler01.java
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundMessageHandlerAdapter;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
*/
class MultiHandler01 extends ChannelInboundMessageHandlerAdapter<String> {
private Logger logger = LoggerFactory.getLogger(MultiHandler01.class);
MultiHandler01() {
}
@Override
public void messageReceived(ChannelHandlerContext ctx, String msg) throws Exception {
logger.info(String.format("Handler01 receive message: %s", msg));
ctx.nextInboundMessageBuffer().add(msg);
ctx.fireInboundBufferUpdated();
}
@Override
public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) throws Exception {
logger.error("Exception caught from {}", ctx.channel().remoteAddress(), cause);
ctx.close();
}
}
MultiHandler02.java
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundMessageHandlerAdapter;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
/**
*/
class MultiHandler02 extends ChannelInboundMessageHandlerAdapter<String> {
private Logger logger = LoggerFactory.getLogger(MultiHandler02.class);
MultiHandler02() {
}
@Override
public void messageReceived(ChannelHandlerContext ctx, String msg) throws Exception {
logger.info(String.format("Handler02 receive message: %s", msg));
}
@Override
public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) throws Exception {
logger.error("Exception caught from {}", ctx.channel().remoteAddress(), cause);
ctx.close();
}
}

Related

Netty Pipeline being executed out of order

I am trying to figure out why my pipeline is being executed out of order.
I took the HexDumpProxy example and was trying to turn it into an HTTP proxy where I can look at all the traffic. For some reason the code is being executed backwards and I can't figure out why.
My server listens on 8443 and takes in the HTTP content. I wanted to read the Host header and create a frontend handler to route the data to the server, but my frontend handler executes first despite being last in the pipeline. I am unsure why it runs first; I thought the handlers would execute in the following order:
LoggingHandler
HttpRequestDecoder
HttpObjectAggregator
HttpProxyListener
HttpReEncoder
HTTPProxyFrontEnd
The goal is to remove the frontend handler from the pipeline and have HttpProxyListener add it back after reading the Host header, but if I remove the frontend handler no data is transferred. Using breakpoints, HTTPProxyFrontEnd is hit before HttpProxyListener. I am unsure why it is being executed so out of order.
Main
```
EventLoopGroup bossGroup = new NioEventLoopGroup(1);
EventLoopGroup workerGroup = new NioEventLoopGroup();
try {
ServerBootstrap b = new ServerBootstrap();
b.group(bossGroup, workerGroup)
.channel(NioServerSocketChannel.class)
.handler(new LoggingHandler(LogLevel.INFO))
.childHandler(new HttpProxyServerInitializer(REMOTE_HOST, REMOTE_PORT))
.childOption(ChannelOption.AUTO_READ, false)
.bind(LOCAL_PORT).sync().channel().closeFuture().sync();
} finally {
bossGroup.shutdownGracefully();
workerGroup.shutdownGracefully();
}
```
Pipeline
```
import io.netty.buffer.ByteBuf;
import io.netty.channel.Channel;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInitializer;
import io.netty.channel.ChannelPipeline;
import io.netty.handler.codec.MessageToByteEncoder;
import io.netty.handler.codec.http.*;
import io.netty.handler.logging.LogLevel;
import io.netty.handler.logging.LoggingHandler;
import io.netty.handler.ssl.SslContext;
import io.netty.handler.ssl.SslContextBuilder;
import io.netty.handler.ssl.SslHandler;
import io.netty.handler.ssl.util.SelfSignedCertificate;
import javax.net.ssl.SSLEngine;
public class HttpProxyServerInitializer extends ChannelInitializer {
private final String remoteHost;
private final int remotePort;
public HttpProxyServerInitializer(String remoteHost, int remotePort) {
this.remoteHost = remoteHost;
this.remotePort = remotePort;
}
@Override
protected void initChannel(Channel ch) throws Exception {
ch.pipeline().addLast(
new LoggingHandler(LogLevel.INFO),
new HttpRequestDecoder(),
new HttpObjectAggregator(8192),
new HttpProxyListener(),
new HttpReEncoder(),
new HTTPProxyFrontEnd(remoteHost, remotePort));
}
}
```
Proxy Front end
```
import io.netty.bootstrap.Bootstrap;
import io.netty.buffer.ByteBuf;
import io.netty.buffer.Unpooled;
import io.netty.channel.*;
import io.netty.channel.embedded.EmbeddedChannel;
import io.netty.handler.codec.DecoderResult;
import io.netty.handler.codec.http.*;
import io.netty.handler.codec.http.cookie.ServerCookieDecoder;
import io.netty.handler.codec.http.cookie.ServerCookieEncoder;
import io.netty.util.CharsetUtil;
import java.net.SocketAddress;
import java.util.List;
import java.util.Map;
import java.util.Set;
import static io.netty.handler.codec.http.HttpResponseStatus.BAD_REQUEST;
import static io.netty.handler.codec.http.HttpResponseStatus.OK;
import static io.netty.handler.codec.http.HttpVersion.HTTP_1_1;
public class HTTPProxyFrontEnd extends ChannelInboundHandlerAdapter {
private final String remoteHost;
private final int remotePort;
private final StringBuilder buf = new StringBuilder();
private HttpRequest request;
// As we use inboundChannel.eventLoop() when building the Bootstrap this does not need to be volatile as
// the outboundChannel will use the same EventLoop (and therefore Thread) as the inboundChannel.
private Channel outboundChannel;
public HTTPProxyFrontEnd(String remoteHost, int remotePort) {
this.remoteHost = remoteHost;
this.remotePort = remotePort;
}
@Override
public void channelActive(ChannelHandlerContext ctx) {
System.out.println("HTTPFrontEnd");
final Channel inboundChannel = ctx.channel();
// Start the connection attempt.
Bootstrap b = new Bootstrap();
b.group(inboundChannel.eventLoop())
.channel(ctx.channel().getClass())
.handler(new HexDumpProxyBackendHandler(inboundChannel))
.option(ChannelOption.AUTO_READ, false);
ChannelFuture f = b.connect(remoteHost, remotePort);
SocketAddress test = ctx.channel().remoteAddress();
outboundChannel = f.channel();
f.addListener(new ChannelFutureListener() {
@Override
public void operationComplete(ChannelFuture future) {
if (future.isSuccess()) {
// connection complete start to read first data
inboundChannel.read();
} else {
// Close the connection if the connection attempt has failed.
inboundChannel.close();
}
}
});
}
@Override
public void channelRead(final ChannelHandlerContext ctx, Object msg) throws InterruptedException {
if (outboundChannel.isActive()) {
outboundChannel.writeAndFlush(msg).addListener(new ChannelFutureListener() {
@Override
public void operationComplete(ChannelFuture future) {
if (future.isSuccess()) {
// was able to flush out data, start to read the next chunk
ctx.channel().read();
} else {
future.channel().close();
}
}
});
}
}
private boolean writeResponse(HttpObject currentObj, ChannelHandlerContext ctx) {
// Decide whether to close the connection or not.
boolean keepAlive = HttpUtil.isKeepAlive(request);
// Build the response object.
FullHttpResponse response = new DefaultFullHttpResponse(
HTTP_1_1, currentObj.decoderResult().isSuccess()? OK : BAD_REQUEST,
Unpooled.copiedBuffer(buf.toString(), CharsetUtil.UTF_8));
response.headers().set(HttpHeaderNames.CONTENT_TYPE, "text/plain; charset=UTF-8");
if (keepAlive) {
// Add 'Content-Length' header only for a keep-alive connection.
response.headers().setInt(HttpHeaderNames.CONTENT_LENGTH, response.content().readableBytes());
// Add keep alive header as per:
// - http://www.w3.org/Protocols/HTTP/1.1/draft-ietf-http-v11-spec-01.html#Connection
response.headers().set(HttpHeaderNames.CONNECTION, HttpHeaderValues.KEEP_ALIVE);
}
// Encode the cookie.
String cookieString = request.headers().get(HttpHeaderNames.COOKIE);
if (cookieString != null) {
Set<io.netty.handler.codec.http.cookie.Cookie> cookies = ServerCookieDecoder.STRICT.decode(cookieString);
if (!cookies.isEmpty()) {
// Reset the cookies if necessary.
for (io.netty.handler.codec.http.cookie.Cookie cookie: cookies) {
response.headers().add(HttpHeaderNames.SET_COOKIE, io.netty.handler.codec.http.cookie.ServerCookieEncoder.STRICT.encode(cookie));
}
}
} else {
// Browser sent no cookie. Add some.
response.headers().add(HttpHeaderNames.SET_COOKIE, io.netty.handler.codec.http.cookie.ServerCookieEncoder.STRICT.encode("key1", "value1"));
response.headers().add(HttpHeaderNames.SET_COOKIE, ServerCookieEncoder.STRICT.encode("key2", "value2"));
}
// Write the response.
//ctx.writeAndFlush(response);
return keepAlive;
}
private static void appendDecoderResult(StringBuilder buf, HttpObject o) {
DecoderResult result = o.decoderResult();
if (result.isSuccess()) {
return;
}
buf.append(".. WITH DECODER FAILURE: ");
buf.append(result.cause());
buf.append("\r\n");
}
@Override
public void channelInactive(ChannelHandlerContext ctx) {
if (outboundChannel != null) {
closeOnFlush(outboundChannel);
}
}
@Override
public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) {
cause.printStackTrace();
closeOnFlush(ctx.channel());
}
/**
* Closes the specified channel after all queued write requests are flushed.
*/
static void closeOnFlush(Channel ch) {
if (ch.isActive()) {
ch.writeAndFlush(Unpooled.EMPTY_BUFFER).addListener(ChannelFutureListener.CLOSE);
}
}
}
```
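For reference on the stated goal (adding the frontend handler only after the Host header has been read), pipeline modification at runtime is done through ctx.pipeline(). The sketch below is hypothetical and not taken from the question: the listener body and REMOTE_PORT are placeholders, and it only illustrates the addLast/remove calls.
```
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;
import io.netty.handler.codec.http.FullHttpRequest;
import io.netty.handler.codec.http.HttpHeaderNames;

public class HttpProxyListener extends ChannelInboundHandlerAdapter {

    private static final int REMOTE_PORT = 443; // placeholder, not from the question

    @Override
    public void channelRead(ChannelHandlerContext ctx, Object msg) {
        if (msg instanceof FullHttpRequest) {
            String host = ((FullHttpRequest) msg).headers().get(HttpHeaderNames.HOST);
            // Add the frontend handler now that the target host is known,
            // then remove this listener so it runs only once per connection.
            ctx.pipeline().addLast(new HTTPProxyFrontEnd(host, REMOTE_PORT));
            ctx.pipeline().remove(this);
        }
        // Pass the message on to whatever now follows in the pipeline.
        ctx.fireChannelRead(msg);
    }
}
```
Note that a handler added after the channel is already active does not receive a channelActive() call, so connection setup done in HTTPProxyFrontEnd.channelActive() would have to move to handlerAdded() or be triggered explicitly after the addLast().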

NullPointerException during init of WebSocketClient

I am getting a NullPointerException when connecting to a secured WebSocket.
The exception:
Exception in thread "main" java.lang.NullPointerException
at org.eclipse.jetty.websocket.common.extensions.AbstractExtension.getName(AbstractExtension.java:90)
at org.eclipse.jetty.websocket.api.extensions.ExtensionFactory.<init>(ExtensionFactory.java:37)
at org.eclipse.jetty.websocket.common.extensions.WebSocketExtensionFactory.<init>(WebSocketExtensionFactory.java:40)
at org.eclipse.jetty.websocket.client.WebSocketClient.<init>(WebSocketClient.java:90)
at com.service.SecureClientSocket.main(SecureClientSocket.java:26)
My code:
package com.service;
import java.net.URI;
import java.util.concurrent.Future;
import org.eclipse.jetty.util.ssl.SslContextFactory;
import org.eclipse.jetty.websocket.api.Session;
import org.eclipse.jetty.websocket.api.annotations.OnWebSocketClose;
import org.eclipse.jetty.websocket.api.annotations.OnWebSocketConnect;
import org.eclipse.jetty.websocket.api.annotations.OnWebSocketError;
import org.eclipse.jetty.websocket.api.annotations.OnWebSocketMessage;
import org.eclipse.jetty.websocket.api.annotations.WebSocket;
import org.eclipse.jetty.websocket.client.WebSocketClient;
@WebSocket
public class SecureClientSocket {
public static void main(String []args){
String url = "wss://qa.sockets.stackexchange.com/";
SslContextFactory sslContextFactory = new SslContextFactory();
sslContextFactory.setTrustAll(true); // The magic
WebSocketClient client = new WebSocketClient(sslContextFactory);
try
{
client.start();
SecureClientSocket socket = new SecureClientSocket();
Future<Session> fut = client.connect(socket,URI.create(url));
Session session = fut.get();
session.getRemote().sendString("Hello");
session.getRemote().sendString("155-questions-active");
}
catch (Throwable t)
{
System.out.println(t.getMessage());
}
}
@OnWebSocketConnect
public void onConnect(Session sess)
{
System.out.println("onConnect({})"+sess);
}
@OnWebSocketClose
public void onClose(int statusCode, String reason)
{
System.out.println("onClose({}, {})"+ statusCode+ reason);
}
@OnWebSocketError
public void onError(Throwable cause)
{
System.out.println(cause);
}
@OnWebSocketMessage
public void onMessage(String msg)
{
System.out.println("onMessage() - {}"+ msg);
}
}

How to create a large number of connections with Netty 5.0

I want to create a large number of client connections to the server for test purposes. I accomplish this by creating a thread per connection, so I can only create about 3000 connections on my machine. Below is my code:
package com.stepnetwork.iot.apsclient.application;
import io.netty.bootstrap.Bootstrap;
import io.netty.buffer.ByteBuf;
import io.netty.buffer.Unpooled;
import io.netty.channel.*;
import io.netty.channel.socket.SocketChannel;
import io.netty.channel.socket.nio.NioSocketChannel;
/**
* Created by sam on 3/22/16.
*/
public class DtuClient extends Thread {
private static final String HOST = "192.168.54.36";
private static final int PORT = 30080;
private EventLoopGroup workerGroup;
private String dtuCode;
public DtuClient(String dtuCode, EventLoopGroup workerGroup) {
this.dtuCode = dtuCode;
this.workerGroup = workerGroup;
}
public void run() {
Bootstrap bootstrap = new Bootstrap(); // (1)
try {
bootstrap.group(workerGroup); // (2)
bootstrap.channel(NioSocketChannel.class); // (3)
bootstrap.option(ChannelOption.SO_KEEPALIVE, true); // (4)
bootstrap.handler(new ChannelInitializer<SocketChannel>() {
@Override
public void initChannel(SocketChannel ch) throws Exception {
ch.pipeline().addLast(new MyClientHandler());
}
});
ChannelFuture feature = bootstrap.connect(HOST, PORT).sync();
feature.addListener((future) -> {
System.out.println(dtuCode + " connected to server");
Channel channel = feature.channel();
ByteBuf buffer = Unpooled.buffer(256);
buffer.writeBytes(dtuCode.getBytes());
channel.writeAndFlush(buffer);
});
feature.channel().closeFuture().sync();
} catch (InterruptedException e) {
e.printStackTrace();
}
System.out.println("completed");
}
}
Can I get more connections?
I tried another solution after doing some research, but the channels close automatically.
Here is my other solution:
package com.stepnetwork.iot.apsclient.application;
import io.netty.bootstrap.Bootstrap;
import io.netty.channel.Channel;
import io.netty.channel.ChannelInitializer;
import io.netty.channel.ChannelOption;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.SocketChannel;
import io.netty.channel.socket.nio.NioSocketChannel;
import java.util.ArrayList;
import java.util.List;
/**
* Created by sam on 3/22/16.
*/
public class Test {
private static final String HOST = "192.168.54.36";
private static final int PORT = 30080;
public static void main(String[] args) throws InterruptedException {
EventLoopGroup workerGroup = new NioEventLoopGroup();
Bootstrap bootstrap = new Bootstrap(); // (1)
try {
bootstrap.group(workerGroup); // (2)
bootstrap.channel(NioSocketChannel.class); // (3)
bootstrap.option(ChannelOption.SO_KEEPALIVE, true); // (4)
bootstrap.handler(new ChannelInitializer<SocketChannel>() {
@Override
public void initChannel(SocketChannel ch) throws Exception {
ch.pipeline().addLast(new MyClientHandler());
}
});
List<Channel> channels = new ArrayList<>();
// create many connections here, but the channels will be closed automatically
for (int i = 0; i < 10000; i++) {
channels.add(bootstrap.connect(HOST, PORT).sync().channel());
}
} catch (InterruptedException e) {
e.printStackTrace();
}
while (true) {
Thread.sleep(Integer.MAX_VALUE);
}
}
}

netty error: InvalidClassException: failed to read class descriptor

What's the correct idiom to send, receive and cast objects between client and server? I'm starting from a simple example; I just want to send a specific type between the client and server.
server output:
BUILD SUCCESSFUL
Total time: 3 seconds
Jul 26, 2014 3:36:22 AM io.netty.handler.logging.LoggingHandler channelRegistered
INFO: [id: 0xefbb5b05] REGISTERED
Jul 26, 2014 3:36:22 AM io.netty.handler.logging.LoggingHandler bind
INFO: [id: 0xefbb5b05] BIND(0.0.0.0/0.0.0.0:4454)
Jul 26, 2014 3:36:22 AM io.netty.handler.logging.LoggingHandler channelActive
INFO: [id: 0xefbb5b05, /0:0:0:0:0:0:0:0:4454] ACTIVE
Jul 26, 2014 3:36:32 AM io.netty.handler.logging.LoggingHandler logMessage
INFO: [id: 0xefbb5b05, /0:0:0:0:0:0:0:0:4454] RECEIVED: [id: 0xabdeec06, /127.0.0.1:59934 => /127.0.0.1:4454]
Jul 26, 2014 3:36:32 AM net.bounceme.dur.netty.ServerHandler exceptionCaught
SEVERE: io.netty.handler.codec.DecoderException: java.io.InvalidClassException: failed to read class descriptor
Jul 26, 2014 3:36:32 AM net.bounceme.dur.netty.ServerHandler channelReadComplete
INFO: finished reading..?
^Cthufir@dur:~/NetBeansProjects/AgentServer$
thufir@dur:~/NetBeansProjects/AgentServer$
client output:
BUILD SUCCESSFUL
Total time: 3 seconds
Jul 26, 2014 3:36:32 AM net.bounceme.dur.client.netty.ClientHandler channelActive
INFO:
id 0
phone 0
title null
state undefined
thufir@dur:~/NetBeansProjects/AgentClient$
ServerHandler:
package net.bounceme.dur.netty;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;
import java.util.logging.Logger;
import net.bounceme.dur.jdbc.Title;
/**
* Handles both client-side and server-side handler depending on which
* constructor was called.
*/
public class ServerHandler extends ChannelInboundHandlerAdapter {
private static final Logger log = Logger.getLogger(ServerHandler.class.getName());
private RecordsQueueWrapper q = null;
public ServerHandler(RecordsQueueWrapper q) {
this.q = q;
}
@Override
public void channelRead(ChannelHandlerContext ctx, Object obj) {
Title receivedTitle = (Title) obj;
log.info(receivedTitle.toString());
Title nextTitle = q.pop();
ctx.write(nextTitle);
log.info("..channelRead");
}
@Override
public void channelReadComplete(ChannelHandlerContext ctx) {
log.info("finished reading..?");
ctx.flush();
}
@Override
public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) {
log.severe(cause.toString());
ctx.close();
}
}
ClientHandler:
package net.bounceme.dur.client.netty;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;
import java.util.logging.Logger;
import net.bounceme.dur.client.jdbc.Title;
public class ClientHandler extends ChannelInboundHandlerAdapter {
private static final Logger log = Logger.getLogger(ClientHandler.class.getName());
public ClientHandler() {
}
@Override
public void channelActive(ChannelHandlerContext ctx) {
Title firstTitle = new Title();
log.info(firstTitle.toString());
ctx.writeAndFlush(firstTitle);
}
@Override
public void channelRead(ChannelHandlerContext ctx, Object msg) {
try {
Title t = (Title) msg;
log.info(msg.toString());
ctx.write(t);
} catch (ClassCastException cce) { //????
log.warning(cce.toString());
}
}
@Override
public void channelReadComplete(ChannelHandlerContext ctx) {
ctx.flush();
}
@Override
public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) {
log.severe(cause.toString());
ctx.close();
}
}
Server:
package net.bounceme.dur.netty;
import io.netty.bootstrap.ServerBootstrap;
import io.netty.channel.ChannelInitializer;
import io.netty.channel.ChannelPipeline;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.SocketChannel;
import io.netty.channel.socket.nio.NioServerSocketChannel;
import io.netty.handler.codec.serialization.ClassResolvers;
import io.netty.handler.codec.serialization.ObjectDecoder;
import io.netty.handler.codec.serialization.ObjectEncoder;
import io.netty.handler.logging.LogLevel;
import io.netty.handler.logging.LoggingHandler;
import io.netty.handler.ssl.SslContext;
import io.netty.handler.ssl.util.SelfSignedCertificate;
import java.security.cert.CertificateException;
import java.util.logging.Logger;
import javax.net.ssl.SSLException;
import net.bounceme.dur.jdbc.Title;
public final class Server {
private static final Logger log = Logger.getLogger(Server.class.getName());
public static void main(String[] args) throws Exception {
MyProps p = new MyProps();
int port = p.getServerPort();
RecordsQueueWrapper q = new RecordsQueueWrapper();
q.init(99);
new Server().startServer(port, false, q);
}
private void startServer(int port, boolean ssl, final RecordsQueueWrapper q) throws CertificateException, SSLException, InterruptedException {
final SslContext sslCtx;
if (ssl) {
SelfSignedCertificate ssc = new SelfSignedCertificate();
sslCtx = SslContext.newServerContext(ssc.certificate(), ssc.privateKey());
} else {
sslCtx = null;
}
EventLoopGroup bossGroup = new NioEventLoopGroup(1);
EventLoopGroup workerGroup = new NioEventLoopGroup();
try {
ServerBootstrap b = new ServerBootstrap();
b.group(bossGroup, workerGroup)
.channel(NioServerSocketChannel.class)
.handler(new LoggingHandler(LogLevel.INFO))
.childHandler(new ChannelInitializer<SocketChannel>() {
@Override
public void initChannel(SocketChannel ch) throws Exception {
ChannelPipeline p = ch.pipeline();
if (sslCtx != null) {
p.addLast(sslCtx.newHandler(ch.alloc()));
}
p.addLast(
new ObjectEncoder(),
new ObjectDecoder(ClassResolvers.weakCachingConcurrentResolver(Title.class.getClassLoader())),
//new ObjectDecoder(ClassResolvers.cacheDisabled(null)),
new ServerHandler(q)
);
}
});
b.bind(port).sync().channel().closeFuture().sync();
} finally {
bossGroup.shutdownGracefully();
workerGroup.shutdownGracefully();
}
}
}
Client:
package net.bounceme.dur.client.netty;
import io.netty.bootstrap.Bootstrap;
import io.netty.channel.ChannelInitializer;
import io.netty.channel.ChannelPipeline;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.SocketChannel;
import io.netty.channel.socket.nio.NioSocketChannel;
import io.netty.handler.codec.serialization.ClassResolvers;
import io.netty.handler.codec.serialization.ObjectDecoder;
import io.netty.handler.codec.serialization.ObjectEncoder;
import io.netty.handler.ssl.SslContext;
import io.netty.handler.ssl.util.InsecureTrustManagerFactory;
import java.util.logging.Logger;
import javax.net.ssl.SSLException;
import net.bounceme.dur.client.jdbc.Title;
/**
* Modification of {@link EchoClient} which utilizes Java object serialization.
*/
public final class Client {
private static final Logger log = Logger.getLogger(Client.class.getName());
static final int SIZE = Integer.parseInt(System.getProperty("size", "256"));
public Client() {
}
public void init() throws InterruptedException, SSLException {
MyProps p = new MyProps();
String host = p.getHost();
int port = p.getServerPort();
startClient(host, port, false);
}
private void startClient(final String host, final int port, final boolean SSL) throws SSLException, InterruptedException {
// Configure SSL.
final SslContext sslCtx;
if (SSL) {
sslCtx = SslContext.newClientContext(InsecureTrustManagerFactory.INSTANCE);
} else {
sslCtx = null;
}
EventLoopGroup group = new NioEventLoopGroup();
try {
Bootstrap b = new Bootstrap();
b.group(group)
.channel(NioSocketChannel.class)
.handler(new ChannelInitializer<SocketChannel>() {
@Override
public void initChannel(SocketChannel ch) throws Exception {
ChannelPipeline p = ch.pipeline();
if (sslCtx != null) {
p.addLast(sslCtx.newHandler(ch.alloc(), host, port));
}
p.addLast(
new ObjectEncoder(),
//new ObjectDecoder(ClassResolvers.cacheDisabled(null)),
new ObjectDecoder(ClassResolvers.weakCachingConcurrentResolver(Title.class.getClassLoader())),
new ClientHandler());
}
});
// Start the connection attempt.
b.connect(host, port).sync().channel().closeFuture().sync();
} finally {
group.shutdownGracefully();
}
}
}
see also:
Strange Netty error whilst deserializing
Check the package of the Title class: the package name must not differ between the server and the client; both sides have to serialize and deserialize against the same fully qualified class. For example: com.server.dto on the server and com.server.dto on the client.
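In this question the server imports net.bounceme.dur.jdbc.Title while the client imports net.bounceme.dur.client.jdbc.Title, so the class descriptors do not match. A minimal sketch of a shared class, assuming a hypothetical common package (net.bounceme.dur.shared) that both projects depend on, with fields guessed from the logged client output:
package net.bounceme.dur.shared; // hypothetical shared package, used by client and server

import java.io.Serializable;

public class Title implements Serializable {

// Keep serialVersionUID stable so both sides agree on the class descriptor.
private static final long serialVersionUID = 1L;

private long id;
private long phone;
private String title;

@Override
public String toString() {
return "id " + id + " phone " + phone + " title " + title;
}
}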

Netty channelRead never called

I've played a bit with Netty and followed a video (https://www.youtube.com/watch?v=tsz-assb1X8) to build a chat server and client. The server works properly (I tested it with telnet), but the client does not receive data. The channelRead method in ChatClientHandler.java is never called, but channelReadComplete is.
ChatClient.java
import io.netty.bootstrap.Bootstrap;
import io.netty.channel.Channel;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.nio.NioSocketChannel;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.InetSocketAddress;
import java.util.logging.Level;
import java.util.logging.Logger;
public class ChatClient {
public static void main(String args[]) {
try {
new ChatClient(new InetSocketAddress("localhost", 8000)).run();
} catch (Exception ex) {
Logger.getLogger(ChatClient.class.getName()).log(Level.SEVERE, null, ex);
}
}
private final InetSocketAddress server;
public ChatClient(InetSocketAddress server) {
this.server = server;
}
public void run() throws Exception {
EventLoopGroup group = new NioEventLoopGroup();
try {
Bootstrap bootstrap = new Bootstrap()
.group(group)
.channel(NioSocketChannel.class)
.handler(new ChatClientInitializer());
Channel channel = bootstrap.connect(server).sync().channel();
System.out.println("Connected to Server: " + server.toString());
BufferedReader in = new BufferedReader(new InputStreamReader(System.in));
while (channel.isActive()) {
String userMessage = in.readLine();
channel.writeAndFlush(userMessage + "\r\n");
if (userMessage.equalsIgnoreCase("bye")) {
group.shutdownGracefully();
break;
}
}
} finally {
group.shutdownGracefully();
}
}
}
ChatClientInitializer.java
import io.netty.channel.ChannelInitializer;
import io.netty.channel.ChannelPipeline;
import io.netty.channel.socket.SocketChannel;
import io.netty.handler.codec.DelimiterBasedFrameDecoder;
import io.netty.handler.codec.Delimiters;
import io.netty.handler.codec.string.StringDecoder;
import io.netty.handler.codec.string.StringEncoder;
public class ChatClientInitializer extends ChannelInitializer<SocketChannel> {
@Override
protected void initChannel(SocketChannel c) throws Exception {
ChannelPipeline pipeline = c.pipeline();
pipeline.addLast(new DelimiterBasedFrameDecoder(8192, Delimiters.lineDelimiter()));
pipeline.addLast(new StringDecoder());
pipeline.addLast(new StringEncoder());
pipeline.addLast(new ChatClientHandler());
} }
ChatClientHandler.java
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;
public class ChatClientHandler extends ChannelInboundHandlerAdapter {
@Override
public void channelRead(ChannelHandlerContext ctx, Object msg) throws Exception {
System.out.println(msg.toString());
}
@Override
public void exceptionCaught(ChannelHandlerContext ctx,
Throwable cause) {
cause.printStackTrace();
}
}
Sorry, my mistake: I forgot to send "\r\n" with each message.
Although shim_ has already determined that the problem was not sending "\r\n" with each message, a clarification is worth adding here.
In ChatClientInitializer.java, a DelimiterBasedFrameDecoder with a line delimiter is added to the client pipeline (see the DelimiterBasedFrameDecoder documentation for details).
That means the client expects messages that end with "\r\n" on that channel. If a message does not end with "\r\n", no complete frame is produced and the channelRead() method of the downstream handler is never invoked.
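For illustration, a minimal sketch of a server-side handler (not from the question, and assuming a StringDecoder/StringEncoder sit earlier in the server pipeline) that appends the delimiter, so the client's DelimiterBasedFrameDecoder can complete a frame:
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;

public class ChatServerHandler extends ChannelInboundHandlerAdapter {
@Override
public void channelRead(ChannelHandlerContext ctx, Object msg) {
// msg is a decoded String here thanks to the StringDecoder.
// The trailing "\r\n" terminates the frame for the client's decoder.
ctx.writeAndFlush("echo: " + msg + "\r\n");
}
}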
