Too much data on Node.js socket? - java

I'm currently developing a system that gets data from the battery pack of an electric vehicle, stores it in a database, and displays it on a screen.
So I have a Java application that reads the data from a hardware interface, interprets the values, and sends them via a socket to a Node.js server. (The Java app and the web server run on the same computer, so the URL is localhost.)
JAVA APP:
s = new Socket();
s.connect(new InetSocketAddress(URL, PORT));
out = new PrintWriter(s.getOutputStream(), true);
for (DataEntry e : entries) {
    out.printf(e.toJson());
}
NODE:
sock.on('data', function(data) {
    try {
        var data = JSON.parse(data);
        db.serialize(function() {
            db.run("INSERT INTO DataEntry(value, MessageField, time) values(" + data.value + "," + data.messageFieldID + ", STRFTIME('%Y-%m-%d %H:%M:%f'))");
        });
    } catch (e) {}
});
I get about 20 messages per second from the hardware interface, which are converted into roughly 100 JSON strings. So the web server has to process one string every 10 ms, which I thought was manageable.
But here is the problem: if my entries list (the foreach loop) has more than two elements, the web server receives two or more of the JSON strings in a single message.
So the first message was divided into two parts (IDs 41 and 42) and was processed correctly. But the second message was divided into five parts (IDs 43-47), and the first four of them weren't delivered alone, so only the last one was saved correctly.
How can I ensure that every JSON string is delivered one after another, as its own message?
Isn't there something like a buffer so that the socket.on('data') handler is called exactly once for every message I send?
I hope somebody of you can help me
Thank you!
Benedikt :)

TCP sockets are just streams and you shouldn't make any assumptions about how much of a "message" is contained in a single packet.
A simple solution is to terminate each message with a newline character, since a compact JSON serialization cannot contain a literal newline. From there it's a simple matter of buffering data until you see a newline character, then calling JSON.parse() on your buffer. For example:
var buf = '';
sock.on('data', function(data) {
    buf += data;
    var p;
    // Use a while loop as it may be possible to have multiple
    // messages buffered depending on chunk contents
    while (~(p = buf.indexOf('\n'))) {
        try {
            var msg = JSON.parse(buf.slice(0, p));
        } catch (ex) {
            console.log('Bad JSON message: ' + ex);
        }
        buf = buf.slice(p + 1);
    }
});
You will also need to change printf() to println() on the Java side so that a newline character is appended to each message.
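For reference, here is a minimal sketch of the Java side with that change applied. URL, PORT, and DataEntry.toJson() come from the question; the DataEntry interface below is just a stand-in so the sketch compiles on its own.

import java.io.OutputStreamWriter;
import java.io.PrintWriter;
import java.net.InetSocketAddress;
import java.net.Socket;
import java.nio.charset.StandardCharsets;
import java.util.List;

public class EntrySender {

    // Stand-in for the question's DataEntry class.
    interface DataEntry {
        String toJson();
    }

    static void sendEntries(String url, int port, List<DataEntry> entries) throws Exception {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(url, port));
            PrintWriter out = new PrintWriter(
                    new OutputStreamWriter(s.getOutputStream(), StandardCharsets.UTF_8), true);
            for (DataEntry e : entries) {
                // println() appends the line terminator that the Node side splits on
                out.println(e.toJson());
            }
        }
    }
}

Note that println() uses the platform line separator; if you want to guarantee a bare '\n', writing out.print(e.toJson() + "\n") does that explicitly.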

Related

Image loads as base64 string when sent via HTTP/1.0?

I am trying to implement HTTP/1.0 in a project with a website that is served by a ServerSocket I've coded. It works fine with character-based files. But with image files, for which I've specified that the base64-encoded version of the image should be returned, it doesn't work, even though the right headers are set, such as Content-Type: image/png and Content-Transfer-Encoding: base64 (RFC 2045). I've looked at the packets in Chrome's networking tool and it looks like it's treating the response as a document even though it's an image file. I have no clue what to do, since I've been stuck on this issue for a couple of DAYS! I've searched all of Stack Overflow and all of Google and I am basically stuck.
I posted this question a day or two ago, where it was recommended to use a byte reader (which I've also tried) without luck. Any input is greatly appreciated.
I have two methods that are relevant.
The first one is where I choose how to read the file, depending on whether it's an image or text.
public String readUri(String reqUri) {
    returnFile = "";
    if (this.fileExists(reqUri)) {
        fileType = this.fileType(reqUri); // returns e.g. "image" from "image/png"
        if (fileType.equals("text")) {
            // bufferedreader ...
        } else if (fileType.equals("image")) {
            File imgPath = new File(reqUri);
            try {
                FileInputStream fileInputStreamReader = new FileInputStream(imgPath);
                byte[] bytes = new byte[(int) imgPath.length()];
                fileInputStreamReader.read(bytes);
                returnFile = Base64.getEncoder().encodeToString(bytes);
                fileInputStreamReader.close();
            } catch (FileNotFoundException e) {
                e.printStackTrace();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
    return returnFile;
}
The second one collects the data from the method above. It is called in my GET request controller and sends the data back to the client through the ServerSocket.
StringBuilder response = new StringBuilder();

public String response(
        String HTTPVersion, int statusCode, String fileContent, String contentType) {
    response.append(
        HTTPVersion + " " +
        statusCode + " " +
        this.getHTTPStatusText(statusCode) + "\n"
    );
    response.append("Content-transfer-encoding: BASE64");
    response.append("Content-Type: " + contentType + "\n");
    response.append("content-length: " + fileContent.length() + "\n");
    response.append("Date: " + date() + "\n");
    response.append("\n");
    response.append(fileContent + "\n");
    return response.toString();
}
Here is a request/response from Chrome's networking tool:
This is how the image is currently loaded with the base64 encoding:
HTTP IS NOT MIME
RFC 2045 is MIME, and although HTTP is similar in some respects to MIME, it is not MIME, and it differs in other respects. In particular it DOES NOT USE Content-Transfer-Encoding. It DOES USE Content-Encoding with a similar meaning. See https://www.rfc-editor.org/rfc/rfc1945#section-10.3 and https://www.rfc-editor.org/rfc/rfc1945#appendix-C.3 et seq.
Also, you are terminating the lines of the response header with only Java \n, which is LF. The standards call for CR LF (Java \r\n) and always have. Some receivers are tolerant, following Postel's dictum, but you shouldn't rely on that. Worse, your code doesn't appear to terminate the Content-Transfer-Encoding line at all, although since Chrome parsed it okay I'm guessing you just posted the wrong code. Also, you should NOT add a line terminator after the body that isn't counted in Content-Length, although if you are using original HTTP/1.0, i.e. without keepalive, this won't matter, because there can't be another request and response on the same transport connection.
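To illustrate those points, here is a minimal sketch (not the poster's code, and with a hard-coded 200 status for brevity) of writing the header lines with CRLF terminators, a Content-Length counted in bytes, and the raw image bytes sent without base64:

import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

public class HttpResponseWriter {

    // Sketch: send a 200 OK with the raw (un-encoded) body bytes.
    static void writeResponse(OutputStream out, String contentType, byte[] body) throws Exception {
        String headers =
                "HTTP/1.0 200 OK\r\n" +                     // CR LF, not just \n
                "Content-Type: " + contentType + "\r\n" +
                "Content-Length: " + body.length + "\r\n" + // length in bytes, not characters
                "\r\n";                                     // blank line ends the header section
        out.write(headers.getBytes(StandardCharsets.US_ASCII));
        out.write(body);                                    // no extra terminator after the body
        out.flush();
    }
}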

Blocking read function in c in a client/server application

I'm currently developing a client/server architecture between a tablet (client) and a Mac/PC (server). I am doing some real-time rendering on both sides and I need communication between the two.
The problem is that I need to do some operations on the string I get from my client (which is basically a rotation matrix). This string therefore contains at most 16 float numbers, which I previously transform into a comma-separated-value string.
Therefore what I should get from my client is something like:
1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0
Server-side, I do some processing of that string to get back my rotation matrix as a float array of 16 elements. The problem is that sometimes I get more than just 16 elements on the server side at once. For instance, I get:
1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0
1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0
1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0
So when I try to split it, I go above the 16-element limit, which is not good at all for me.
My question is: is there a way to prevent the server and/or the client from reading/sending more than one complete matrix at a time? Since I'm using a tablet and doing some real-time rendering, I would like to save as much processing power as possible.
Here is the code that I'm using (just snippets, as the files are quite big).
Client:
if (connected == true && matrixupdated == true && this.hasMatrixChanged()) {
    try {
        this.inFromServer = new BufferedReader(new InputStreamReader(clientSocket.getInputStream()));
        this.outToServer = new DataOutputStream(clientSocket.getOutputStream());
        this.sentence = this.getStringFromMatrix();
        outToServer.writeBytes(sentence + '\n');
        this.hasServerProcessed = false;
        System.arraycopy(matrix, 0, previousMatrix, 0, 16); // I check whether the matrix changes enough for me to send it to the server
    } catch (Exception e) {
        Log.e("ClientActivity", "S: Error", e);
    }
    this.matrixupdated = false;
}
Server :
while ((read_size = recv(sock, client_message, 2000, 0)) > 0)
{
    smatrix = client_message; // smatrix is a true C++ string
    pthread_mutex_lock(&mymutex);
    pthread_cond_wait(&mycondition, &mymutex); // prevent real-time rendering from using the matrix at the same time as this function
    std::stringstream ss(smatrix);
    while (std::getline(ss, tok, ',')) {
        matrix[i] = ::atof(tok.c_str());
        i++;
    }
    i = 0;
    pthread_mutex_unlock(&mymutex);
}
Working as designed. TCP is a byte stream protocol. There are no message boundaries. If you want messages you have to implement them yourself.
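The client above already terminates each matrix with '\n' (writeBytes(sentence + '\n')), so one way to implement message boundaries is to buffer received data and only process complete newline-terminated lines. Below is a minimal Java sketch of that framing idea (the port number is an arbitrary example); the C server would do the equivalent by appending what recv() returns to a buffer and splitting on '\n'.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.ServerSocket;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class MatrixReceiver {
    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(5000);
             Socket client = server.accept();
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(client.getInputStream(), StandardCharsets.US_ASCII))) {
            String line;
            // readLine() returns exactly one '\n'-terminated message, no matter
            // how the bytes were split or merged on the wire
            while ((line = in.readLine()) != null) {
                String[] parts = line.split(",");
                float[] matrix = new float[parts.length];
                for (int j = 0; j < parts.length; j++) {
                    matrix[j] = Float.parseFloat(parts[j]);
                }
                // hand the 16-element matrix to the renderer here
            }
        }
    }
}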

Maximum as3 adobe JSON string length

I've written a socket communication server in Java and an AIR program in AS3, using a socket connection.
The communication over the socket connection is done with JSON serialization.
Sometimes, with really long JSON strings over the socket, the AS3 code reports a JSON parse error.
I end each JSON string with an end marker so the program knows whether it has received the whole message, so this should not be a problem of the AIR program reading the message in parts.
The error occurs only with really long JSON strings, for example a string of length 78,031. Are there any limits for JSON serialization?
I had the same problem. The problem is in how the Flash app reads data from the socket.
The point is that the Flash ProgressEvent.SOCKET_DATA event fires even when the server hasn't sent all the data yet, and something is still left over (especially when the data is big and the connection is slow).
So something like {"key":"value"} comes in two (or more) parts, like {"key":"val and ue"}. Also, sometimes you might receive several joined JSONs in one message, like {"json1key":"value"}{"json2key":"value"}; the built-in Flash JSON parser cannot handle either of these.
To fight this, I recommend modifying your SOCKET_DATA handler in the Flash app to add a cache for received strings. Like this:
// declaring vars
private var _socket:Socket;
private var _cache:String = "";

// adding EventListener
_socket.addEventListener(ProgressEvent.SOCKET_DATA, onSocketData);

private function onSocketData(e:Event):void
{
    // take the incoming data from the socket
    var fromServer:ByteArray = new ByteArray;
    while (_socket.bytesAvailable)
    {
        _socket.readBytes(fromServer);
    }
    var receivedToString:String = fromServer.toString();
    _cache += receivedToString;
    if (receivedToString.length == 0) return; // nothing to parse

    // convert that long string to a Vector of JSONs
    // here is a very small and not fail-safe algorithm for detecting separate JSONs in one long String
    var jsonPart:String = "";
    var jsonVector:Vector.<String> = new Vector.<String>;
    var bracketsCount:int = 0;
    var endOfLastJson:int = 0;
    for (var i:int = 0; i < _cache.length; i++)
    {
        if (_cache.charAt(i) == "{") bracketsCount += 1;
        if (bracketsCount > 0) jsonPart = jsonPart.concat(_cache.charAt(i));
        if (_cache.charAt(i) == "}")
        {
            bracketsCount -= 1;
            if (bracketsCount == 0)
            {
                jsonVector.push(jsonPart);
                jsonPart = "";
                endOfLastJson = i;
            }
        }
    }

    // removing the part that isn't needed anymore
    if (jsonVector.length > 0)
    {
        _cache = _cache.substr(endOfLastJson + 1);
    }

    for each (var part:String in jsonVector)
    {
        trace("RECEIVED: " + part); // voila! here is the full received JSON
    }
}
According to Adobe, it appears that you are not facing a JSON problem but rather a Socket limitation.
A String sent over a Socket via writeUTF() and readUTF() is limited to 65,535 bytes, because the string is prefixed with a 16-bit unsigned length rather than being null-terminated.
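If the Java server is the side calling writeUTF(), one common workaround (a sketch under assumptions, not the poster's code) is to write an explicit 32-bit length followed by the raw UTF-8 bytes, which removes the 64 KB ceiling. The AS3 side would then read the 4-byte length first (e.g. readInt()) and only decode the payload once that many bytes are available.

import java.io.DataOutputStream;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

public class LongMessageWriter {

    // Sketch: length-prefixed frames instead of writeUTF()'s 16-bit length limit.
    static void writeLongString(OutputStream rawOut, String json) throws Exception {
        DataOutputStream out = new DataOutputStream(rawOut);
        byte[] payload = json.getBytes(StandardCharsets.UTF_8);
        out.writeInt(payload.length); // 32-bit length prefix, so messages can exceed 65,535 bytes
        out.write(payload);
        out.flush();
    }
}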

Java sockets and DataOutputStream delay

I am a beginner in Java and I am working on a project involving sockets and external devices.
I have written a small listener for the units and I am having some problems sending messages to the devices.
The unit sends two kinds of messages to the server:
1. the heartbeat message, which is in hex
2. the actual data, which is a string and looks like: field1,field2,field3,field4
The listener opens a new thread for each device and waits for messages.
I use a DataInputStream for receiving messages and a DataOutputStream for sending them. Each time a unit connects to the server, it opens a new thread and reads the heartbeat; from that heartbeat I know what type of device it is and which one it is. If the device is new, I configure it based on two unique serials which the unit can send me and which I have to compare against the database.
My problem is in the part where I ask the device for the two serials. It takes a long time, 30 to 40 seconds, and it has to take under 10 seconds to send the command (tested with another TCP server).
Here is that part of the code:
String[] commands = deviceDetails.getConfigCommands().split(";"); // I take the commands from a file; they look like: command1;command2
for (int i = 0; i < commands.length; i++) {
    sendCommand = commands[i].getBytes(); // sendCommand is a byte[]
    out.write(sendCommand);
    out.flush();
    loop: while (true) {
        in.read(buffer); // buffer is a byte[]
        String[] result = new String(buffer, "ASCII").split(":|="); // the response from the unit looks like: ok:commandname=code, I need the code
        if (configCounter == 2 && imei.equals("")) {
            i = 0;
            break loop;
        } else if (configCounter == 2 && simid.equals("")) {
            i = 1;
            break loop;
        }
        configCounter++;
        switch (result[1]) {
            case "IMEI":
                imei = result[2];
                break loop;
            case "SIMID":
                simid = result[2];
                break loop;
        }
    }
}
In the while loop I wait for the response from the unit, but before that I try to send the command to the unit, and it takes about 30-40 seconds for each command.
Can anyone tell me what the problem is in the code above? How can I make the DataOutputStream send the command in less than 30-40 seconds?
Thank you!

Processing / Reading Serial Port Data Streams Crashing Program

I'm working on a simple program to read a continuous stream of data from a plain old serial port. The program is written in Processing. Performing a simple read of the data and dumping it to the console works perfectly fine, but whenever I add any other functionality (graphing, DB entry) to the program, the port becomes de-synchronized and all data from the serial port starts to come in corrupted.
The incoming data from the serial port is in the following format:
A [TAB] //start flag
Data 1 [TAB]
Data 2 [TAB]
Data 3 [TAB]
Data 4 [TAB]
Data 5 [TAB]
Data 6 [TAB]
Data 7 [TAB]
Data 8 [TAB]
COUNT [TAB] //count of number of messages sent
Z [CR] //end flag followed by carriage return
So, as stated, if I run the program below and simply have it output to the console, it runs fine without issue for several hours. If I add the graphing functionality or the database connectivity, the serial data starts to come in garbled and the serial port handler is never able to decode a message correctly again. I've tried all sorts of workarounds, thinking it is a timing problem, but reducing the speed of the serial port doesn't seem to make a difference.
As you can see in the serial port handler, I provide a large buffer just in case the terminating Z character is chopped off. I check to make sure the A and Z characters are in the correct places and, in turn, that the created substring is the correct length. When the program starts to fail, the substring continuously fails this check until the program just crashes. Any ideas? I've tried several different ways of reading the serial port and am just beginning to wonder if I am missing something stupid here.
// Serial Port Tester
import processing.serial.*;
import processing.net.*;
import org.gwoptics.graphics.graph2D.Graph2D;
import org.gwoptics.graphics.graph2D.traces.ILine2DEquation;
import org.gwoptics.graphics.graph2D.traces.RollingLine2DTrace;
import de.bezier.data.sql.*;

SQLite db;
RollingLine2DTrace r1, r2, r3, r4;
Graph2D g;
Serial mSerialport;               // the serial port
String[] svalues = new String[8]; // string values
int[] values = new int[8];        // int values
int endflag = 90;                 // 'Z'
byte seperator = 13;              // carriage return

class eq1 implements ILine2DEquation {
    public double computePoint(double x, int pos) {
        // data function for graph/plot
        return (values[0] - 32768);
    }
}

void connectDB()
{
    db = new SQLite(this, "data.sqlite");
    if (db.connect())
    {
        db.query("SELECT name as \"Name\" FROM SQLITE_MASTER where type=\"table\"");
        while (db.next())
        {
            println(db.getString("Name"));
        }
    }
}

void setup() {
    size(1200, 1000);
    connectDB();
    println(Serial.list());
    String portName = Serial.list()[3];
    mSerialport = new Serial(this, portName, 115200);
    mSerialport.clear();
    mSerialport.bufferUntil(endflag); // generate serial event when endflag is received
    background(0);
    smooth();
    // graph setup
    r1 = new RollingLine2DTrace(new eq1(), 250, 0.1f);
    r1.setTraceColour(255, 0, 0);
    g = new Graph2D(this, 1080, 500, false);
    g.setYAxisMax(10000);
    g.addTrace(r1);
    g.position.y = 50;
    g.position.x = 100;
    g.setYAxisTickSpacing(500);
    g.setXAxisMax(10f);
}

void draw() {
    background(200);
    //g.draw(); // enable this and the program crashes quickly
}

void serialEvent(Serial mSerialport)
{
    byte[] inBuffer = new byte[200];
    mSerialport.readBytesUntil(seperator, inBuffer);
    String inString = new String(inBuffer);
    String subString = "";
    int startFlag = inString.indexOf("A");
    int endFlag = inString.indexOf("Z");
    if (startFlag == 0 && endFlag == 48)
    {
        subString = inString.substring(startFlag + 1, endFlag);
    }
    else
    {
        println("ERROR: BAD MESSAGE DISCARDED!");
        subString = "";
    }
    if (subString.length() == 47)
    {
        svalues = (splitTokens(subString));
        values = int(splitTokens(subString));
        println(svalues);
        //if (db.connect()) // enable this and the program crashes quickly
        //{
        //    if (svalues[0] != null && svalues[7] != null)
        //    {
        //        statement = svalues[7] + ", " + svalues[0] + ", " + svalues[1] + ", " + svalues[2] + ", " + svalues[3] + ", " + svalues[4] + ", " + svalues[5] + ", " + svalues[6];
        //        db.execute("INSERT INTO rawdata (messageid,press1,press2,press3,press4,light1,light2,io1) VALUES (" + statement + ");");
        //    }
        //}
    }
}
While I'm not familiar with your specific platform, my first thought from reading your problem description is that you still have a timing problem. At 115,200 bps, data is coming in rather quickly: more than 10 characters every millisecond. As such, if you spend precious time opening a database (slow file I/O) or drawing graphics (also potentially slow), you might well not be able to keep up with the data.
So it might be a good idea to put the serial port processing on its own thread, interrupt, etc. That might make the multitasking much easier. Again, this is just an educated guess.
Also, you say that your program "crashes" when you enable the other operations. Do you mean that the entire process actually crashes, or that you get corrupted data, or both? Is it possible that you are overrunning your 200-byte inBuffer[]? At 115 kbps, it would take only about 20 ms to do so.
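As one illustration of the threading suggestion (a sketch under assumptions, not the answerer's or poster's code; messageQueue and storeInDatabase() are hypothetical names), the serialEvent() handler could do nothing but hand the raw line to a worker thread, keeping the slow database work off the serial path:

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

BlockingQueue<String> messageQueue = new LinkedBlockingQueue<String>();

void serialEvent(Serial port) {
    // Keep the handler minimal: just queue the raw, CR-terminated line.
    String line = port.readStringUntil(13); // 13 = carriage return
    if (line != null) {
        messageQueue.offer(line);
    }
}

void startWorker() {
    Thread worker = new Thread(new Runnable() {
        public void run() {
            while (true) {
                try {
                    String line = messageQueue.take(); // blocks until a message arrives
                    storeInDatabase(line);             // hypothetical: parse and insert off the serial path
                } catch (InterruptedException e) {
                    return;
                }
            }
        }
    });
    worker.setDaemon(true);
    worker.start();
}

startWorker() would be called once from setup(), and draw() stays free to render the graph.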
