I am trying to transfer an image saved on a Raspberry Pi to an Android application for display. I am treating the Raspberry Pi as the server and the Android app as the client. The server side is implemented in Python: I use pylab to save a figure on the Raspberry Pi, then later open the image and read its contents as a byte array. This byte array is then sent to the Android app, which is written in Java.
I can view the image on the Pi when I open it, but once it is sent to the Android device, something happens that causes the wrong number of bytes to be read and the image to arrive corrupted. I realized that Java reads multi-byte values as big-endian, while the Raspberry Pi's byte order is little-endian. Could this be what is causing the problem with the transfer?
So I am wondering whether there is a way either to encode the byte array as big-endian when it is sent from Python, or to decode it as little-endian when it is received in Java. I tried simply reversing the image byte array in Python, but that did not work. Any suggestions would be very helpful!
I am not an expert in the hardware differences between a Pi and other platforms, but this conversion can be done with ByteBuffer.
You can get an instance of ByteBuffer using ByteBuffer.wrap(byteArray) or ByteBuffer.allocate(capacity).
You can then set the endianness of the buffer with buffer.order(ByteOrder.BIG_ENDIAN) or buffer.order(ByteOrder.LITTLE_ENDIAN).
Individual bytes can then be read with buffer.get(); note that single bytes are unaffected by byte order, so the order only matters for multi-byte reads such as buffer.getInt() or buffer.getShort().
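A minimal sketch of the idea (the byte values here are illustrative): if the sender transmits a 4-byte length header, the receiver must decode it with the sender's byte order, not Java's default.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class EndianDemo {
    public static void main(String[] args) {
        // Suppose these four bytes arrived from the Python side as a
        // little-endian length header (value 1024 = 0x00000400).
        byte[] header = {0x00, 0x04, 0x00, 0x00};

        ByteBuffer buf = ByteBuffer.wrap(header);
        buf.order(ByteOrder.LITTLE_ENDIAN);   // match the sender's byte order
        int length = buf.getInt();            // reads 4 bytes using that order
        System.out.println(length);           // 1024

        // The same bytes read as big-endian give a very different value.
        ByteBuffer big = ByteBuffer.wrap(header).order(ByteOrder.BIG_ENDIAN);
        System.out.println(big.getInt());     // 262144
    }
}
```

Note that the raw bytes of an image file themselves need no reordering; a file's byte sequence is defined by its format, not by the CPU. Endianness only matters for multi-byte values such as a length header, so if the image is corrupted, a common alternative culprit is reading from the socket without looping until all the expected bytes have arrived.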
Related
I have a web service sending a huge JSON text to an Android app. There are about 20,000 ID numbers. Unfortunately, but perhaps not surprisingly, it's timing out.
What options do I have? The easiest one that comes to mind is compressing this data somehow. Is there a way I can do this effectively (PHP web service, Java Android app)?
Failing that, is there some technique for sending JSON in parts? If so, how does that work? And at what point is JSON considered too big to send in one part? Thank you
You can use GZIP in PHP, send it as a stream to the client, and then decode the data with Java on Android.
You can use this for GZIP in PHP: GZIP
and for GZIP on Android: GZIP
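On the Android side, the decoding step uses the standard java.util.zip classes. A minimal round-trip sketch (the compression is done locally here just to exercise the decode path; on a real setup the PHP side produces the gzip stream):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipDemo {
    // Decompress a gzip byte stream back into a UTF-8 string.
    static String gunzip(byte[] compressed) throws IOException {
        try (GZIPInputStream in = new GZIPInputStream(new ByteArrayInputStream(compressed));
             ByteArrayOutputStream out = new ByteArrayOutputStream()) {
            byte[] buf = new byte[4096];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            return out.toString(StandardCharsets.UTF_8.name());
        }
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for the JSON the PHP service would send.
        String json = "{\"ids\":[1,2,3]}";

        // Compress (the server side would normally do this).
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bytes)) {
            gz.write(json.getBytes(StandardCharsets.UTF_8));
        }

        System.out.println(gunzip(bytes.toByteArray())); // prints the original JSON
    }
}
```

JSON full of repetitive ID numbers compresses very well with gzip, which is why this is usually the first thing to try before restructuring the API.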
You could compress data with ob_gzhandler(). Put this call in your script before any output:
ob_start('ob_gzhandler');
After that, output will be compressed with gzip.
Compression alone is not a complete solution, though. You should also split the JSON and send it as a sequence of smaller pieces; otherwise, what will you do when even the compressed data is too big?
I have a server program written in C# that continuously takes screenshots and sends them to my Android device. I want to compress those screenshot images on the server side (C#) and decompress them on the client side (Java). Any ideas, please?
I have a simple conceptual question: why can I only send binary data from Android to an FTP server? I am currently learning this and it confuses me. With ByteArrayInputStream I convert my string into bytes so it can be sent in binary. But why binary?
In a digital computer, all data are binary.
Are you talking about the lack of the "ASCII" mode that some FTP clients support? That mode has always been problematic in the way it tends to corrupt data. Making an exact copy, without alterations, is a lot safer.
Everything is binary in the computer world.
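To see that "binary" here just means the raw bytes of the string, with no transformation of the content, consider this sketch:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

public class BinaryDemo {
    public static void main(String[] args) throws IOException {
        String text = "hello ftp";

        // "Converting to binary" is just taking the string's bytes
        // in some encoding; nothing about the content changes.
        byte[] data = text.getBytes(StandardCharsets.UTF_8);

        // This is the kind of stream an FTP upload would consume.
        ByteArrayInputStream in = new ByteArrayInputStream(data);
        byte[] readBack = new byte[data.length];
        in.read(readBack);

        // Decoding the same bytes recovers the identical string.
        System.out.println(new String(readBack, StandardCharsets.UTF_8)); // hello ftp
    }
}
```

Transferring the bytes untouched ("binary" or "image" mode in FTP terms) guarantees the receiver gets exactly what the sender had, which is precisely what the ASCII mode mentioned above cannot guarantee.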
I am using the JSpeex API to convert a .wav file into a .spx file. Everything works perfectly when tested on a desktop; it takes only 2 seconds.
An Android developer used the same code, but it took around 3 minutes to encode the same file on the emulator and on a phone. Is there any way to reduce this encoding time? The code used to convert is as follows:
new JSpeexEnc().encode(new File("source.wav"), new File("dest.spx"));
Compression takes time. The better the compression, the longer it takes, and Speex is pretty good compression.
2 seconds of desktop computer time is absolutely ages.
JSpeex is a Java implementation. Use a native implementation instead, ideally the platform codecs.
On phones, speech is best compressed using AMR - not necessarily the best quality/compression, but most likely hardware accelerated, since it's the format used by GSM. You can usually get AMR straight from the microphone.
How do you get large WAV files onto an Android device in the first place? If it's actually the output of the microphone, consider using AMR as outlined above.
If you need Speex and you have a wav file, then consider sending it to a server for compression.
I'm trying to write a Java program to send live microphone data over UDP, then receive the data in VLC. I'm basically using the same code as in this post to package up the stream and send it over. When I receive the data in VLC, I get nothing: I see a bunch of input coming in, but none of it is interpreted as audio data. VLC tries to resolve the input as mpga or mpgv, but I'm pretty sure it's being sent as raw audio. Is the problem on VLC's end? Should I configure VLC to receive a specific format? Or is the problem that my program isn't packaging the data in a way VLC can interpret?
First thing you should do is capture the live microphone data to a file and figure out exactly what format it is. Then transfer the file to VLC (if that makes sense) to see if VLC can cope with it in that form.
If you are going to use UDP in the long term, you need to be sure that the audio format you are using can cope with the loss of chunks of data in the middle of the audio stream due to network packet loss. If not, you should use TCP rather than UDP.
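Before debugging VLC at all, it can also help to verify locally that the raw bytes survive a UDP hop unchanged. A loopback sketch (the chunk contents and buffer sizes are arbitrary):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class UdpLoopback {
    public static void main(String[] args) throws Exception {
        // Stand-in for one chunk of raw audio from the microphone.
        byte[] audioChunk = {10, 20, 30, 40, 50};

        try (DatagramSocket receiver = new DatagramSocket(0); // bind any free port
             DatagramSocket sender = new DatagramSocket()) {
            InetAddress local = InetAddress.getLoopbackAddress();
            int port = receiver.getLocalPort();

            // Send the chunk, as the streaming side would for each buffer.
            sender.send(new DatagramPacket(audioChunk, audioChunk.length, local, port));

            // Receive and confirm the payload arrived intact.
            byte[] buf = new byte[1024];
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            receiver.receive(packet);
            System.out.println(packet.getLength()); // 5
        }
    }
}
```

If the bytes arrive intact here, the remaining question is format negotiation: bare PCM packets are not self-describing, and VLC's udp:// input generally expects a container it can recognize (such as an MPEG transport stream), so raw audio usually needs to be wrapped, for example in RTP with a declared payload format, or VLC must be told the format explicitly.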