I'm having a difficult time understanding what is going wrong in my application. The code below executes successfully on its first pass (a prior HttpResponse), and then on its second call hangs at the dataInputStream.readFully(...) line.
void doSomething(HttpResponse response) {
    HttpEntity responseBody = response.getEntity();
    long len = responseBody.getContentLength();
    byte[] payload = new byte[(int) len]; // <-- I've confirmed this is the correct length
    DataInputStream d = null;
    try {
        InputStream bais = responseBody.getContent();
        d = new DataInputStream(bais);
        d.readFully(payload); // <-- *** HANGS HERE! ***
        EntityUtils.consume(responseBody);
        ...
    } finally {
        if (d != null) {
            IOUtils.closeQuietly(d);
        }
    }
}
After blocking/hanging for 10+ seconds, the application times out and the calling thread is torn down.
I've inspected the response object, and confirmed its Content-Length to be what is expected.
Nothing jumped out at me reading the DataInputStream Javadoc.
Going through our commit history, I noticed that the IOUtils.closeQuietly(...) call was introduced at some point. Trust me here: it has been hard to track down the initial failure of this application in our integration system, so I cannot confirm whether this finally block introduced any unintended behavior. From what I can tell, calling closeQuietly() there is the recommended approach.
Update: For clarity, this code is part of an application performing HTTP RANGED GET requests. E.g., the first response pulls down N bytes, starting at byte 0; the second call GETs the next N bytes. The file is large, so more than two ranged GETs are required.
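For context, each ranged request is built more or less like the sketch below (placeholder URL and offsets, HttpClient 4.x assumed; this is not my exact request code):

import java.io.IOException;
import org.apache.http.HttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;

CloseableHttpClient client = HttpClients.createDefault();

// Sketch only: "http://example.com/large-file" is a placeholder.
HttpResponse fetchRange(long offset, long chunkSize) throws IOException {
    HttpGet get = new HttpGet("http://example.com/large-file");
    // Range is inclusive on both ends, so this asks for chunkSize bytes starting at offset.
    get.setHeader("Range", "bytes=" + offset + "-" + (offset + chunkSize - 1));
    return client.execute(get); // the response is then handed to doSomething(response)
}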
Related
I have HBase code that I use for gets (although I don't have Kerberos enabled yet, I plan to later, so I wanted to make sure that user credentials are handled correctly when connecting and doing a Put or Get).
final ByteArrayOutputStream bos = new ByteArrayOutputStream();
MyHBaseService.getUserHBase().runAs(new PrivilegedExceptionAction<Object>() {
    @Override
    public Object run() throws Exception {
        Connection connection = null;
        Table StorageTable = null;
        List<hFile> HbaseDownload = new ArrayList<>();
        try {
            // Open an HBase Connection
            connection = ConnectionFactory.createConnection(MyHBaseService.getHBaseConfiguration());
            Get get = new Get(Bytes.toBytes("filenameCell"));
            Result result = table.get(get);
            byte[] data = result.getValue(Bytes.toBytes(MyHBaseService.getDataStoreFamily()), Bytes.toBytes(MyHBaseService.getDataStoreQualifier()));
            bos.write(data, 0, data.length);
            bos.flush();
            ...
        }
    }
});
// now get the outputstream.
// I am assuming byteArrayStream is synchronized and thread-safe.
return bos.toByteArray();
However, I wasn't sure whether this runs asynchronously or synchronously.
The problem:
I use:
Get get = new Get(Bytes.toBytes("filenameCell"));
Result result = table.get(get);
Inside this run() function. But to get information OUT of the run() I use a ByteArrayOutputStream created OUTSIDE the run(), call its write() and flush() inside the run(), and then call toByteArray() to get the binary bytes of the HBase content out of the function. This returns null bytes though, so maybe I'm not doing this right.
However, I am having difficulty finding good examples of the HBase Java API for doing these things, and no one seems to use runAs the way I do, which is strange.
I have HBase 1.2.5 client running inside a Web App (request-based function calls).
Here the work is running inside MyHBaseService.getUserHBase().runAs(...). But if it runs asynchronously, the program will reach "bos.toByteArray();" before the action has actually executed, since that call is outside the runAs(). So it could return the output before the function has even finished executing.
I think that's the reason for the null values.
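If runAs() is actually synchronous and returns whatever run() returns (the way UserGroupInformation.doAs does), then something like the sketch below would avoid the outside stream entirely. The table name is a placeholder and the MyHBaseService helpers are assumed from the code above:

import java.security.PrivilegedExceptionAction;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

byte[] fetchCell() throws Exception {
    // runAs() hands back whatever run() returns, so the bytes can come straight out.
    return MyHBaseService.getUserHBase().runAs(new PrivilegedExceptionAction<byte[]>() {
        @Override
        public byte[] run() throws Exception {
            // "storage_table" is a placeholder table name.
            try (Connection connection = ConnectionFactory.createConnection(MyHBaseService.getHBaseConfiguration());
                 Table table = connection.getTable(TableName.valueOf("storage_table"))) {
                Get get = new Get(Bytes.toBytes("filenameCell"));
                Result result = table.get(get);
                return result.getValue(Bytes.toBytes(MyHBaseService.getDataStoreFamily()),
                                       Bytes.toBytes(MyHBaseService.getDataStoreQualifier()));
            }
        }
    });
}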
I've been trying to get json streaming to work in jersey 2. For the life of me nothing streams until the stream is complete.
I've tried this example trying to simulate a slow producer of data.
#Path("/foo")
#GET
public void getAsyncStream(#Suspended AsyncResponse response) {
StreamingOutput streamingOutput = output -> {
JsonGenerator jg = new ObjectMapper().getFactory().createGenerator(output, JsonEncoding.UTF8);
jg.writeStartArray();
for (int i = 0; i < 100; i++) {
jg.writeObject(i);
try {
Thread.sleep(100);
}
catch (InterruptedException e) {
logger.error(e, "Error");
}
}
jg.writeEndArray();
jg.flush();
jg.close();
};
response.resume(Response.ok(streamingOutput).build());
}
And yet Jersey just sits there until the JSON generator is done before returning the results. I'm watching the results come through in Charles Proxy.
Do I need to enable something? I'm not sure why this won't stream out.
Edit:
This may actually be working, just not how I expected it. I don't think the stream is writing things in real time, which is what I wanted; it's more about not having to buffer responses and being able to write them out to the client immediately. If I run a loop of a million with no thread sleep, then the data does get written out in chunks without having to buffer it in memory.
Your edit is correct. It is working as expected. StreamingOutput is just a wrapper that lets us write directly to the response stream, but it does not actually mean the response is streamed on each server-side write to the stream. Also, AsyncResponse does not provide any different response as far as the client is concerned. It is simply there to help increase throughput with long-running tasks. The long-running task should actually be done in another thread, so the method can return.
See more at Asynchronous Server API
What you seem to be looking for instead is Chunked Output
Jersey offers a facility for sending response to the client in multiple more-or-less independent chunks using a chunked output. Each response chunk usually takes some (longer) time to prepare before sending it to the client. The most important fact about response chunks is that you want to send them to the client immediately as they become available without waiting for the remaining chunks to become available too.
Not sure how it will work for your particular use case, as the JsonGenerator expects an OutputStream (which the ChunkedOutput we use is not), but here is a simpler example
#Path("async")
public class AsyncResource {
#GET
public ChunkedOutput<String> getChunkedStream() throws Exception {
final ChunkedOutput<String> output = new ChunkedOutput<>(String.class);
new Thread(() -> {
try {
String chunk = "Message";
for (int i = 0; i < 10; i++) {
output.write(chunk + "#" + i);
Thread.sleep(1000);
}
} catch (Exception e) {
} finally {
try {
output.close();
} catch (IOException ex) {
Logger.getLogger(AsyncResource.class.getName())
.log(Level.SEVERE, null, ex);
}
}
}).start();
return output;
}
}
Note: I had a problem getting this to work at first. I would only get the delayed, complete result. The problem turned out to be something completely separate from the program: it was actually my AVG antivirus causing it. A feature called "LinkScanner" was preventing the chunking from occurring. I disabled that feature and it started working.
I haven't explored chunking much and am not sure of the security implications, so I am not sure why the AVG application has a problem with it.
EDIT
It seems the real problem is that Jersey buffers the response in order to calculate the Content-Length header. You can see this post for how to change that behavior.
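A sketch of one way to do that, assuming Jersey 2's ServerProperties.OUTBOUND_CONTENT_LENGTH_BUFFER property (the resource package name below is a placeholder):

import org.glassfish.jersey.server.ResourceConfig;
import org.glassfish.jersey.server.ServerProperties;

// Setting the outbound content-length buffer to 0 means Jersey no longer buffers
// entity bytes to compute Content-Length, so writes go out as they happen
// (the response is then sent chunked instead).
ResourceConfig config = new ResourceConfig()
        .packages("com.example.resources")
        .property(ServerProperties.OUTBOUND_CONTENT_LENGTH_BUFFER, 0);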
I am writing a Java HTTP server. I thought the entire server was working and it is using threading. However, I'm realizing that the piece of code that reads the request into a BufferedReader is not working consistently.
Here is the code that reads an incoming request:
private String receive(WebSocket webSocket) throws IOException {
    int chr;
    System.out.println("Receiving!");
    StringBuffer buffer = new StringBuffer();
    while ((chr = webSocket.in().read()) != -1) {
        buffer.append((char) chr);
        if (!webSocket.in().ready())
            break;
    }
    return buffer.toString();
}
My Websocket class just wraps the Socket and provides an in and an out. I did this so that I could mock out the socket and test my server.
The Websocket class looks like this:
package http.server.socket;

import java.io.*;
import java.net.Socket;

public class SystemSocket implements WebSocket {
    private Socket theConnection;
    private BufferedReader in;
    private OutputStream out;

    public SystemSocket(Socket theConnection) throws IOException {
        this.theConnection = theConnection;
        in = new BufferedReader(new InputStreamReader(theConnection.getInputStream()));
        out = new BufferedOutputStream(theConnection.getOutputStream());
    }

    public BufferedReader in() throws IOException {
        return in;
    }

    public OutputStream out() throws IOException {
        return out;
    }

    public void close() throws IOException {
        in.close();
        out.close();
        theConnection.close();
    }
}
The problem is that with each URL the user enters in a browser, two requests are made - one for the page requested and one for the favicon. Sometimes - it seems - the favicon request does not come in and the thread hangs.
Here's some debugging information I have printing to the console when things go right:
Receiving!
Receiving!
REQUEST STRING = GET /color_picker.html HT
[20130821 20:29:23] REQUEST: http://localhost:5000/color_picker.html
[20130821 20:29:23] PAGE RENDERED
REQUEST STRING = GET /favicon.ico HTTP/1.1
[20130821 20:29:23] REQUEST: http://localhost:5000/favicon.ico
[20130821 20:29:23] PAGE RENDERED
The "Receiving" message is getting printed whenever the request is getting read. So, in this case, the "Receiving" message got printed twice, two requests came in and two things were rendered. But then, the same page (but at a different time) will do this (after about 10 seconds):
Receiving!
Receiving!
REQUEST STRING = GET /color_picker.html HTTP/1.1
[20130821 20:41:25] REQUEST: http://localhost:5000/color_picker.html
[20130821 20:41:25] PAGE RENDERED
REQUEST STRING =
Exception in thread "ServerThread" java.lang.ArrayIndexOutOfBoundsException: 1
at http.request.Parser.setRequestLineData(Parser.java:42)
at http.request.Parser.setRequestHash(Parser.java:27)
at http.request.Parser.parse(Parser.java:13)
at http.request.Request.get(Request.java:18)
at http.server.ServerThread.run(ServerThread.java:39)
All the subsequent errors are because the request string is null. But I can't figure out why the request string is null. I can't even figure out how to debug it.
Can anyone help??
It is also important to note that if the second request string doesn't come in right away, the user can request a new URL, which causes the second hung process to complete (so then the fourth request is the one that hangs). So it's only when the user stops requesting things that, on the last request, after about 10 seconds, I get the error. Sometimes I can request 20 different pages, and it's only after I stop requesting pages and wait a few seconds that I see an error. I think this is what is happening?
UPDATE:
Per the request, here is the setRequestLineData() method:
private void setRequestLineData() {
    requestHash = new HashMap<String, String>();
    if (requestLineParts.length == 3) {
        requestHash.put("httpMethod", requestLineParts[0]);
        requestHash.put("url", requestLineParts[1]); //line 42
        requestHash.put("httpProtocol", requestLineParts[2]);
    }
    else {
        requestHash.put("httpMethod", requestLineParts[0]);
        requestHash.put("url", requestLineParts[1]);
        requestHash.put("queryString", requestLineParts[2]);
        requestHash.put("httpProtocol", requestLineParts[3]);
    }
}
UPDATE:
I think I figured out more about what is going on here with my mentor's help. His thought is that once a request is received, the browser starts another request right away to reduce load time for the next request. This sounds plausible to me, since I can load page after page after page, but it's only about 10 seconds after the last page is requested that I get an error. Currently I'm handling this with a custom exception, but am working on a better solution. Thanks for all the help, guys!
ready() isn't a valid test for end of message. It only tells you whether there is data available to be read without blocking. TCP isn't a message-oriented protocol, it is a byte-stream protocol. If you want messages you must implement them yourself, e.g. as lines, length-value tuples, type-length-value tuples, serialized objects, XML documents, ...
There are few if any correct uses of ready() (or available()), and this isn't one of them.
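For an HTTP server, the natural message boundary for the request head is the blank line that ends the headers, so you can read lines until you hit it instead of calling ready(). A rough sketch, assuming the WebSocket wrapper from the question (it ignores request bodies, which is fine for plain GETs):

private String receive(WebSocket webSocket) throws IOException {
    StringBuilder request = new StringBuilder();
    String line;
    // readLine() returns an empty string for the blank line that terminates the headers,
    // and null if the client closes the connection.
    while ((line = webSocket.in().readLine()) != null && !line.isEmpty()) {
        request.append(line).append("\r\n");
    }
    return request.toString();
}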
I am a beginner in Java and I am working on a project involving sockets and external devices.
I have written a small listener for the units and I am having some problems sending messages to the devices.
The unit sends 2 messages to the server:
1. the heartbeat message, which is in hex
2. the actual data, which is a string and looks like: field1,field2,field3,field4
The listener opens a new thread for each device and waits for messages.
I use a DataInputStream for receiving messages and a DataOutputStream for sending them. Each time a unit connects to the server, it opens a new thread and gets the heartbeat; from the heartbeat I know what type of device it is and which one it is. If the device is new, I configure it based on 2 unique serials which the unit knows how to send me, and I have to compare them against the database.
My problem is in the part where I ask the device for the 2 serials. It takes a long time, 30 to 40 seconds. It has to be under 10 seconds to send the command (tested on another TCP server).
Here is that part of code:
String[] commands = deviceDetails.getConfigCommands().split(";"); // I take the commands from a file; they look like: command1;command2
for (int i = 0; i < commands.length; i++) {
    sendCommand = commands[i].getBytes(); // sendCommand is a byte[]
    out.write(sendCommand);
    out.flush();
    loop: while (true) {
        in.read(buffer); // buffer is a byte[]
        String[] result = new String(buffer, "ASCII").split(":|="); // the response from the unit looks like: ok:commandname=code, I need the code
        if (configCounter == 2 && imei.equals("")) {
            i = 0;
            break loop;
        } else if (configCounter == 2 && simid.equals("")) {
            i = 1;
            break loop;
        }
        configCounter++;
        switch (result[1]) {
            case "IMEI":
                imei = result[2];
                break loop;
            case "SIMID":
                simid = result[2];
                break loop;
        }
    }
}
In the while loop I am waiting for the response from the unit, but before that I am trying to send the command to the unit, and it takes about 30-40 seconds for each command.
Can anyone tell me what the problem is in the code above? How can I make the DataOutputStream send the command faster than 30-40 seconds?
Thank you!
I am developing an interface that takes as input an encrypted byte stream -- probably a very large one -- and generates output of more or less the same format.
The input format is this:
{N byte envelope}
- encryption key IDs &c.
{X byte encrypted body}
The output format is the same.
Here's the usual use case (heavily pseudocoded, of course):
Message incomingMessage = new Message (inputStream);
ProcessingResults results = process (incomingMessage);
MessageEnvelope messageEnvelope = new MessageEnvelope ();
// set message encryption options &c. ...
Message outgoingMessage = new Message ();
outgoingMessage.setEnvelope (messageEnvelope);
writeProcessingResults (results, outgoingMessage);
outgoingMessage.writeToOutput (outputStream);
To me, it seems to make sense to use the same object to encapsulate this behaviour, but I'm at a bit of a loss as to how I should go about it. It isn't practical to load all of the encrypted body in at once; I need to be able to stream it (so, I'll be using some kind of input stream filter to decrypt it), but at the same time I need to be able to write out new instances of this object. What's a good approach to making this work? What should Message look like internally?
I wouldn't create one class to handle both input and output: one class, one responsibility. I would use two filter streams, one for input/decryption and one for output/encryption:
InputStream decrypted = new DecryptingStream(inputStream, decryptionParameters);
...
OutputStream encrypted = new EncryptingStream(outputStream, encryptionOptions);
They may have something like a lazy init mechanism, reading the envelope before the first read() call / writing the envelope before the first write() call. You can still use classes like Message or MessageEnvelope inside the filter implementations, but they can stay package-private, non-API classes.
The processing code then knows nothing about encryption or decryption; it just works on a stream. You can also use both streams at the same time during processing, streaming both the processing input and the processing output.
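A minimal sketch of the lazy-init idea for the input side. Envelope and its helpers here are hypothetical, and CipherInputStream is just one way to do the actual decryption:

import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;
import javax.crypto.Cipher;
import javax.crypto.CipherInputStream;

class DecryptingStream extends FilterInputStream {

    private final InputStream raw;
    private boolean initialised;

    DecryptingStream(InputStream raw) {
        super(raw);
        this.raw = raw;
    }

    // Read the envelope lazily, then swap in a decrypting view of the rest of the stream.
    private void init() throws IOException {
        if (!initialised) {
            Envelope envelope = Envelope.readFrom(raw);         // hypothetical envelope parser
            Cipher cipher = envelope.createDecryptionCipher();  // hypothetical cipher factory
            this.in = new CipherInputStream(raw, cipher);
            initialised = true;
        }
    }

    @Override
    public int read() throws IOException {
        init();
        return in.read();
    }

    @Override
    public int read(byte[] b, int off, int len) throws IOException {
        init();
        return in.read(b, off, len);
    }
}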
Can you split the body at arbitrary locations?
If so, I would have two threads, an input thread and an output thread, with a concurrent queue of strings that the output thread monitors. Something like:
ConcurrentLinkedQueue<String> outputQueue = new ConcurrentLinkedQueue<String>();
...
private void readInput(BufferedReader reader) throws IOException {
    String str;
    while ((str = reader.readLine()) != null) {
        outputQueue.offer(processStream(str)); // ConcurrentLinkedQueue has offer/add, not put
    }
}

private String processStream(String input) {
    // do something
    return input;
}

private void writeOutput(Writer out) throws IOException, InterruptedException {
    while (true) {
        while (outputQueue.peek() == null) {
            Thread.sleep(100); // polling; a BlockingQueue would avoid the busy wait
        }
        out.write(outputQueue.poll());
        out.flush();
    }
}
Note: This will definitely not work as-is. Just a suggestion of a design. Someone is welcome to edit this.
If you need to read and write at the same time you either have to use threads (different threads for reading and writing) or asynchronous I/O (the java.nio package). Using the input and output streams of a socket from different threads is not a problem.
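A bare-bones sketch of the threaded variant on a single socket (placeholder payload, minimal error handling):

import java.io.InputStream;
import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

void readAndWriteConcurrently(Socket socket) throws Exception {
    InputStream in = socket.getInputStream();
    OutputStream out = socket.getOutputStream();

    // One thread writes while the current thread reads; the two streams are independent.
    Thread writer = new Thread(() -> {
        try {
            out.write("hello".getBytes(StandardCharsets.US_ASCII)); // placeholder payload
            out.flush();
        } catch (Exception e) {
            e.printStackTrace();
        }
    });
    writer.start();

    byte[] buffer = new byte[4096];
    int n;
    while ((n = in.read(buffer)) != -1) {
        // handle the n bytes just read
    }
    writer.join();
}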
If you want to make a streaming API in Java, you should usually provide an InputStream for reading and an OutputStream for writing. That way they can be passed to other APIs, so you can chain things and keep the data flowing as streams all the way through.
Input example:
Message message = new Message(inputStream);
results = process(message.getInputStream());
Output example:
Message message = new Message(outputStream);
writeContent(message.getOutputStream());
The message needs to wrap the given streams with classes that do the needed encryption and decryption.
Note that reading multiple messages at the same time or writing multiple messages at the same time would need support from the protocol too. You need to get the synchronization correct.
You should check the Wikipedia article on block cipher modes that support encryption of streams. Different encryption algorithms may support only a subset of these.
Buffered streams will allow you to read, encrypt/decrypt and write in a loop.
Examples demonstrating ZipInputStream and ZipOutputStream could provide some guidance on how you may solve this. See example.
What you need is Cipher Streams (CipherInputStream). Here is an example of how to use them.
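Roughly like this (a sketch: the algorithm, key, and IV below are placeholders; in your case they would come from the message envelope):

import java.io.InputStream;
import java.io.OutputStream;
import javax.crypto.Cipher;
import javax.crypto.CipherInputStream;
import javax.crypto.spec.IvParameterSpec;
import javax.crypto.spec.SecretKeySpec;

static void decryptTo(InputStream encryptedBody, OutputStream plaintextOut,
                      byte[] keyBytes, byte[] ivBytes) throws Exception {
    Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding"); // placeholder algorithm
    cipher.init(Cipher.DECRYPT_MODE, new SecretKeySpec(keyBytes, "AES"), new IvParameterSpec(ivBytes));
    try (CipherInputStream in = new CipherInputStream(encryptedBody, cipher)) {
        byte[] buffer = new byte[8192];
        int n;
        while ((n = in.read(buffer)) != -1) {
            plaintextOut.write(buffer, 0, n); // decrypts chunk by chunk; the whole body is never in memory
        }
    }
}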
I agree with Arne, the data processor shouldn't know about encryption, it just needs to read the decrypted body of the message, and write out the results, and stream filters should take care of encryption. However, since this is logically operating on the same piece of information (a Message), I think they should be packaged inside one class which handles the message format, although the encryption/decryption streams are indeed independent from this.
Here's my idea for the structure, flipping the architecture around somewhat, and moving the Message class outside the encryption streams:
class Message {

    InputStream input;
    Envelope envelope;

    public Message(InputStream input) {
        assert input != null;
        this.input = input;
    }

    public Message(Envelope envelope) {
        assert envelope != null;
        this.envelope = envelope;
    }

    public Envelope getEnvelope() {
        if (envelope == null && input != null) {
            // Read envelope from beginning of stream
            envelope = new Envelope(input);
        }
        return envelope;
    }

    public InputStream read() {
        assert input != null;
        // Initialise the decryption stream
        return new DecryptingStream(input, getEnvelope().getEncryptionParameters());
    }

    public OutputStream write(OutputStream output) {
        // Write envelope header to output stream
        getEnvelope().write(output);
        // Initialise the encryption
        return new EncryptingStream(output, getEnvelope().getEncryptionParameters());
    }
}
Now you can use it by creating a new message for the input, and one for the output:
OutputStream output; // This is the stream for sending the message
Message inputMessage = new Message(input);
Message outputMessage = new Message(inputMessage.getEnvelope());
process(inputMessage.read(), outputMessage.write(output));
Now the process method just needs to read chunks of data as required from the input, and write results to the output:
public void process(InputStream input, OutputStream output) throws IOException {
    byte[] buffer = new byte[1024];
    int read;
    while ((read = input.read(buffer)) > 0) {
        // Process buffer, writing to output as you go.
    }
}
This all now works in lockstep, and you don't need any extra threads. You can also abort early without having to process the whole message (if the output stream is closed for example).