I was reading this page: http://docs.oracle.com/javase/7/docs/api/java/util/zip/CRC32.html
There is a method to get a CRC32 value from an array of bytes. However, when I send this CRC32 value to a receiver, how can the receiver check that the value is valid?
The idea is that the receiver will run the same algorithm on the data you sent through, and it should get the same CRC32 value. If not, the data (or CRC) has been corrupted somehow.
In other words, let's say your payload is "Pax is an incredibly handsome guy" and you calculate the CRC as 42.
You send through both that payload and the CRC to the receiving end and it performs the same actions on the payload, also coming up with 42.
That means the content of that payload is correct. But not necessarily the veracity :-)
That's what CRC stands for: cyclic redundancy check.
Keep in mind that CRCs tend to be effective against accidental damage to the payload but, unless there's some secret information being used to generate the check, they can be faked.
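As a minimal sketch of that round trip, both sides can compute the checksum with java.util.zip.CRC32 and compare (the payload and class name here are just illustrative):

import java.nio.charset.StandardCharsets;
import java.util.zip.CRC32;

public class Crc32Demo {
    // Both sender and receiver run this same computation.
    static long crc32(byte[] payload) {
        CRC32 crc = new CRC32();
        crc.update(payload);
        return crc.getValue();
    }

    public static void main(String[] args) {
        byte[] payload = "Pax is an incredibly handsome guy".getBytes(StandardCharsets.UTF_8);
        long sentCrc = crc32(payload);              // sender computes and transmits this with the payload
        boolean intact = crc32(payload) == sentCrc; // receiver recomputes over the received bytes
        System.out.println("Payload intact: " + intact);
    }
}

If the two values differ, the payload (or the CRC itself) was corrupted in transit.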
I'm new to VoIP. I want to create an application that streams audio in a client/server architecture using RTP. Various APIs are available, but I need an understanding at the core level. I have studied the RFC. Can anybody suggest how to build an audio RTP packet and send it to the server, along with unpacking the packet, in Java?
Thanks in advance.
Create an empty Java class.
Add members for all the fields in the RTP header. Use boolean for the single-bit fields. For the numerical fields, be mindful of how many bits you need: use int for SSRC and timestamp, short for the sequence number, byte for the payload type, etc. CSRC should be an array (or ArrayList or whatever) of ints. The audio payload should be a byte array.
Packets are just arrays of bytes, so you need a toBytes() method that will output a packet byte[], and a constructor that takes a byte[] as a parameter. To send a packet, call toBytes() and put the result in a UDP packet.
In your toBytes() method, create a byte array that's 12 bytes, plus an additional 4 bytes for each CSRC, plus however many bytes are in your audio payload.
Single-bit values need to be set using the bitwise OR operator. For example, the marker bit is the first bit of the second byte, so to set it:
if (marker)
{
    bytes[1] = (byte) (bytes[1] | 0x80); // 0x80 is 1000 0000; the cast is needed because | promotes to int
}
To set the values that are ints or shorts, you'll need to convert the value to a network-order (big-endian) byte array and then set it in the buffer using System.arraycopy. I'll leave it up to you to find out how to create the network-order byte array.
For the constructor that takes a byte[], you'll need to do the above process in reverse. To check the value of a single bit, use the AND operator, e.g.:
marker = (bytes[1] & 0x80) != 0; // parentheses matter: == and != bind tighter than &
Either in this class or a helper class you'll probably need methods to help set the timestamp based on the packet count and sample rate. For example, if the payload is G.711, it's 8000 samples/second, which means a packet sent every 20ms with a 160 byte payload, so timestamp will increase by 160 every packet.
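To make the serialization half concrete, here is a minimal sketch of the toBytes() idea for just the 12-byte fixed header plus payload (the field names are assumptions; CSRC handling and the reverse constructor are omitted). ByteBuffer is big-endian by default, which is one way to get the network byte order mentioned above:

import java.nio.ByteBuffer;

public class RtpPacket {
    boolean marker;
    byte payloadType;   // 7 bits
    short sequence;
    int timestamp;
    int ssrc;
    byte[] payload;

    byte[] toBytes() {
        ByteBuffer buf = ByteBuffer.allocate(12 + payload.length); // fixed header + audio
        buf.put((byte) 0x80);                   // version 2, no padding/extension, CC = 0
        byte b1 = (byte) (payloadType & 0x7F);
        if (marker) b1 |= 0x80;                 // marker is the first bit of the second byte
        buf.put(b1);
        buf.putShort(sequence);                 // putShort/putInt write big-endian (network order)
        buf.putInt(timestamp);
        buf.putInt(ssrc);
        buf.put(payload);
        return buf.array();
    }
}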
In Java, I am polling a WebSphere MQ message queue, expecting a message of STRING format that is composed entirely of XML. Part of this XML will contain the bytes of a file attachment (any format: pdf, image, etc.) which will then be converted to a blob for storage in an Oracle Db and later retrieval.
The issue I am having is that example files of known size end up in my Db with a different size. I am not adding anything to the bytes (as far as I know), and the size appears to be larger directly after I get the message. I cannot determine whether I am somehow adding information at retrieval, during the conversion from bytes to String, or whether this is happening on the front end when the sender populates the message.
My code at retrieval of the message:
inboundmsg = new MQMessage();
inboundmsg = getMQMsg(FrontIncomingQueue, gmo);  // get the next message from the queue
strLen = inboundmsg.getMessageLength();
strData = new byte[strLen];
ibm_id = inboundmsg.messageId;
inboundmsg.readFully(strData);                   // read the raw message bytes
inboundmsgContents = new String(strData);        // converts using the platform default charset
I see a file known to have size 21K go to 28K. A coworker has suggested that charset/encoding may be the issue. I do not specify a charset in the constructor call to String above, nor in any of the calls to getBytes when converting back from a string (for other unrelated uses). My default charset is ISO-8859-1. When speaking with the vendor who is initiating the message transfer, I asked her what charset she is using. Her reply:
"I am using the File.WriteAllBytes method in C# - I pass it the path of my file and it writes it to a byte[]. I could not find any documentation on the MSDN about what encoding the function uses. The method creates a byte array and from what I have read online this morning there is no encoding, its just a sequence of 8bit unsigned binary data with no encoding."
Another coworker has suggested that perhaps the MQ charset is the culprit, but my reading of the documentation suggests that MQ charset only affects the behavior of readString, readLine, & writeString.
If I circumvent MQ totally, and populate a byte array using a File Input Stream and a local file, the file size is preserved all the way to Db store, so this definitely appears to be happening at or during message transfer.
The problem is evident in the wording of the question. You describe a payload that contains arbitrary binary data, yet you try to process it as a string. These two things are mutually exclusive.
This appears to be complicated by the vendor not providing valid XML. For example, consider the attachment:
<PdfBytes>iVBORw0KGgoAAAANS … AAAAASUVORK5CYII=</PdfBytes>
If the attachment legitimately contains any XML special character such as < or > then the result is invalid XML. If it contains null bytes, some parsers assume they have reached the end of the text and stop parsing there. That is why you normally see any attachment in XML either converted to Base64 for transport or else converted to hexadecimal.
The vendor describes writing raw binary data which suggests that what you are receiving contains non-string characters and therefore should not be sent as string data. If she had described some sort of conversion that would make the attachment XML compliant then string would be appropriate.
Interestingly, a Base64 encoding results in a payload that is about 1.33 times larger than the original. Coincidence that 21K * 1.33 ≈ 28K? One would think that what is received is actually the binary payload in Base64 format. That would actually be parseable as a string and would account for the difference in file sizes. But it isn't at all what the vendor described doing. She said she's writing "8bit unsigned binary data with no encoding" and not Base64.
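As a quick sanity check of that arithmetic (the 21K figure is from the question; java.util.Base64 requires Java 8+):

import java.util.Base64;

byte[] raw = new byte[21 * 1024];   // stand-in for the 21K binary attachment
String encoded = Base64.getEncoder().encodeToString(raw);
// 4 output chars per 3 input bytes: 21504 bytes -> 28672 chars, i.e. ~28K
System.out.println(raw.length + " bytes -> " + encoded.length() + " chars");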
So we expect it to fail but not necessarily to result in a larger payload. Consider that WebSphere MQ receiving a message in String format will attempt to convert it. If the CCSID of the message differs from that requested on the GET then MQ will attempt a conversion. If the inbound CCSID is UTF-16 or any double-byte character set, certain characters will be expanded from one to two bytes - assuming the conversion doesn't hit invalid binary characters that cause it to fail.
If the two CCSIDs are the same then no conversion is attempted in the MQ classes but there is still an issue in that something has to parse an XML payload that is by definition not valid and therefore subject to unexpected results. If it happens that the binary payload does not contain any XML special characters and the parser doesn't choke on any embedded null bytes, then the parser is going to rather heroic lengths to forgive the non-compliant payload. If it gets to the </PdfBytes> tag without choking, it may assume that the payload is valid and convert everything between the <PdfBytes>...</PdfBytes> tags itself. Presumably to Base64.
All of this is conjecture, of course. But in a situation where the payload is unambiguously not string data any attempt to parse it as string data will either fail outright or produce unexpected and potentially bizarre results. You are actually unfortunate that it doesn't fail outright because now there's an expectation that the problem is on your end when it clearly appears to be the vendor's fault.
Assuming that the content of the payload remains unchanged, the vendor should be sending bytes messages and you should be receiving them as bytes. That would at least fix the problems MQ is having reconciling the expected format with the actual received format, but it would still be invalid XML. If it happens to work that the vendor sends binary data in a message set to type String while you process it as bytes, then count your blessings and use it that way, but don't count on it being reliable. Eventually you'll get a payload with an embedded XML special character and then you will have a very bad day.
Ideally, the vendor should know better than to send binary data in an XML payload without converting it first to string and it is up to them to fix it so that it is compliant with the XML spec and reliable.
Please see this MSDN page: XML, SOAP, and Binary Data
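If the attachment does turn out to be Base64 text inside the <PdfBytes> element, a rough sketch of the bytes-based handling might look like this (inboundmsg as in the question's code; the substring-based extraction is a simplification, a real XML parser would be preferable):

import java.util.Base64;

// Read the message as raw bytes instead of letting anything convert it as a whole.
byte[] msgBytes = new byte[inboundmsg.getMessageLength()];
inboundmsg.readFully(msgBytes);

// The markup itself is character data; only the attachment needs decoding.
String xml = new String(msgBytes, "ISO-8859-1");
int start = xml.indexOf("<PdfBytes>") + "<PdfBytes>".length();
int end = xml.indexOf("</PdfBytes>");
byte[] attachment = Base64.getDecoder().decode(xml.substring(start, end));
// attachment should now match the original file size, ready for the blob insert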
I have a multi-threaded client-server application that uses Vector<String> as a queue of messages to send.
I need, however, to send a file using this application. In C++ I would not really worry, but in Java I'm a little confused about converting anything to a string.
Java has 2-byte characters. When you view a Java string in hex, it usually looks like:
00XX 00XX 00XX 00XX
unless some non-ASCII characters are present.
Java also uses big-endian byte order.
These facts make me unsure whether, and if so how, to add the file into the queue. The preferred format of the file would be:
-- Headers --
2 bytes Size of the block (excluding header, which means first four bytes)
2 bytes Data type (text message/file)
-- End of headers --
2 bytes Internal file ID (to avoid referring by filenames)
2 bytes Length of filename
X bytes Filename
X bytes Data
You can see I'm already using 2 bytes for all numbers, to avoid the horrible bit operations required to extract two numbers from a single two-byte char.
But I have really no idea how to add the file data correctly. For numbers, I assume this would do:
StringBuilder packetData = new StringBuilder();
packetData.append((char) packetSize);
packetData.append((char) PacketType.BINARY.ordinal()); //Just convert enum constant to number
But the file is really a problem. If I have described anything wrongly regarding the Java data types, please correct me - I'm a beginner.
Does it have to send only Strings? I think if it does then you really need to encode it using base64 or similar. The best approach overall would probably be to send it as raw bytes. Depending on how difficult it would be to refactor your code to support byte arrays instead of just Strings, that may be worth doing.
To answer your String question I just saw pop up in the comments, there's a getBytes method on a String.
For the socket question, see:
Java sending and receiving file (byte[]) over sockets
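If refactoring to byte arrays isn't feasible and the queue has to stay Vector<String>, a minimal Base64 sketch could look like this (the file name is illustrative, and java.util.Base64 requires Java 8+):

import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Base64;

// Sender: turn the file into a String that survives the String-only queue.
byte[] fileBytes = Files.readAllBytes(Paths.get("example.pdf"));
String encoded = Base64.getEncoder().encodeToString(fileBytes);
queue.add(encoded);   // queue is the Vector<String> from the question

// Receiver: recover the exact original bytes.
byte[] decoded = Base64.getDecoder().decode(encoded);

The 2-byte-character worry goes away with this approach, because Base64 output is plain ASCII.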
I am developing a Java-based downloader for binary data. This data is transferred via a text-based protocol (UU-encoded). For the networking task the netty library is used. The binary data is split by the server into many thousands of small packets and sent to the client (i.e. the Java application).
From netty I receive a ChannelBuffer object every time a new message (data) is received. Now I need to process that data; among other tasks I need to check the header of the packet coming from the server (like the HTTP status line). To do so I call ChannelBuffer.array() to receive a byte[] array. This array I can then convert into a string via new String(byte[]) and easily check (e.g. compare) its content (again, like comparison to the "200" status message in HTTP).
The software I am writing is using multiple threads/connections, so that I receive multiple packets from netty in parallel.
This usually works fine; however, while profiling the application I noticed that when the connection to the server is good and data comes in very fast, this conversion to a String object seems to be a bottleneck. The CPU usage is close to 100% in such cases, and according to the profiler a great deal of time is spent in calling this String(byte[]) constructor.
I searched for a better way to get from the ChannelBuffer to a String, and noticed the former also has a toString() method. However, that method is even slower than the String(byte[]) constructor.
So my question is: Does anyone of you know a better alternative to achieve what I am doing?
Perhaps you could skip the String conversion entirely? You could have constants holding byte arrays for your comparison values and check array-to-array instead of String-to-String.
Here's some quick code to illustrate. Currently you're doing something like this:
String http200 = "200";
// byte[] -> String conversion happens every time
String input = new String(channelBuffer.array()); // channelBuffer is the received ChannelBuffer
return input.equals(http200);
Maybe this is faster:
// Ideally convert String -> byte[] only once. Store these
// arrays somewhere and look them up instead of recalculating.
final byte[] http200 = "200".getBytes(StandardCharsets.UTF_8); // select the correct charset!
// The input doesn't have to be converted at all!
byte[] input = channelBuffer.array();
return Arrays.equals(input, http200);
Some of the checking you are doing might only look at part of the buffer. If you can use the alternate form of the String constructor:
new String(byteArray, offset, length)
then far fewer bytes get converted to a string. Your example of looking for "200" within the message would be a case in point.
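For instance, a sketch assuming the status code sits at a fixed, known offset (offset 9, as in a status line like "HTTP/1.1 200 OK"; channelBuffer as in the snippets above):

byte[] bytes = channelBuffer.array();
// Convert only the 3 status digits instead of the whole message.
String status = new String(bytes, 9, 3, "ASCII");
return "200".equals(status);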
2. You might find that you can use the length of the byte array as a clue. If some messages are long and you are looking for a short one, ignore the long ones and don't convert to characters. Or something like that.
3. Along with what @EricGrunzke said, look partially in the byte buffer to filter out some messages and find that you don't need to convert them from bytes to characters.
4. If your bytes are ASCII characters, the conversion to characters might be quicker if you use the charset "ASCII" instead of whatever the default is for your server:
new String(bytes, "ASCII")
might be faster in that case. In fact, you might be able to pick and choose the charset for the byte-to-character conversion in some organized fashion that speeds things up.
Depending on what you are trying to do there are a few options:
If you are just trying to get the response status, can't you just call getStatus()? That would probably be faster than getting the string out.
If you are trying to convert the buffer then, assuming you know it will be ASCII (which it sounds like you do), just leave the data as a byte[] and convert your UUDecode method to work on a byte[] instead of a String.
The biggest cost of the string conversion is most likely the copying of the data from the byte array to the internal char array of the String; this, combined with the conversion itself, is probably work you don't need to do.
How do I use a User Data Header (UDH) in SMPP? I want to send an SMS to a specific port.
I am using OpenSMPP as my project library.
Follow these steps to send UDH over SMPP:
Set the UDHI bit to 1 in the esm_class field. The simplest way to do this: esm_class = esm_class | 0x40.
Put the UDH at the beginning of the short_message field. Read on for a quick summary, and see the references for the details of how to encode a UDH.
Here is how to encode UDH:
The first byte of the UDH gives the length (in bytes) of the remaining part of the UDH. Since you may not know this ahead of time, you may have to fill it in later.
Then follow one or more IEs (Information Elements). Each IE has 3 parts:
First byte: IEI (IE Indicator). Identifies the element you want to encode. There are established IEIs.
Second byte: IEIDL (IEI Data Length). Gives the number of bytes that hold the data part. Each established IEI has a fixed value for this field.
Third byte and onward: IEID (IEI Data). Holds the data part. Each established IEI has a fixed format for the data.
Count the total bytes consumed by each IE and put the result in the first byte.
For sending an SMS to a port, you can use IEI 0x04 (8-bit port addresses) or 0x05 (16-bit port addresses). I have only seen 0x05 being used; a sketch follows.
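Here is a minimal sketch of assembling the UDH bytes for 16-bit application port addressing (IEI 0x05); the port numbers are illustrative:

// UDH for application port addressing, 16-bit (IEI 0x05)
int destPort = 16000;  // illustrative destination port
int srcPort  = 0;      // illustrative source port
byte[] udh = {
    0x06,                                     // UDHL: 6 bytes of UDH data follow
    0x05,                                     // IEI: application port addressing, 16-bit
    0x04,                                     // IEIDL: 4 bytes of IE data
    (byte) (destPort >> 8), (byte) destPort,  // destination port, big-endian
    (byte) (srcPort >> 8),  (byte) srcPort    // source port, big-endian
};
// Prepend udh to the message body, use the result as short_message,
// and remember to OR 0x40 into esm_class as described in step 1.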
References
3GPP specification: 23.040
User Data Header