I am developing an app that sends roughly 400 KB of data every 5 seconds. I have read that we can compress the data in PHP using gzip, but how do I measure the compression, and how do I decompress it in Java/Android?
Problem Statement:
I'm uploading an image as a base64 URI from my react-native app to my Java backend server. My backend converts the URI string to a byte array and stores it in the MySQL database (as a BLOB). So far it's all fine! But when I'm reading/fetching the images from the database, I convert them back to base64 image URI strings to show them to the user (fetching through my REST API). The problem is that my REST API (GET) can handle about 2-3 images and then it runs out of memory... What can I do? It's because the base64 URI strings are obviously too long for the REST API...
Any resolution?
In your backend, you should store images as files, not byte arrays. Convert the base64 string to a file in Java with something like the sketch below (I don't personally know the exact code off-hand).
Once you've done this, your backend has to return the file's URL so you can display it in your app with the Image component from react-native.
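A minimal sketch of that conversion, assuming the incoming string is a standard data:image/...;base64, URI; the class and method names here are just illustrative:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Base64;

public class ImageStorage {

    /**
     * Decodes a base64 data-URI string and writes it to disk.
     * Returns the path of the stored file so the REST API can
     * expose it as a URL instead of returning the raw bytes.
     */
    public static Path saveBase64Image(String base64Uri, String targetDir, String fileName)
            throws IOException {
        // Strip an optional "data:image/...;base64," prefix if present.
        String payload = base64Uri.contains(",")
                ? base64Uri.substring(base64Uri.indexOf(',') + 1)
                : base64Uri;

        byte[] imageBytes = Base64.getDecoder().decode(payload);

        Path target = Paths.get(targetDir, fileName);
        Files.createDirectories(target.getParent());
        return Files.write(target, imageBytes);
    }
}
```

The returned path can then be mapped to a public URL by whatever static-file route your backend already exposes.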
To export a JSON array to an Excel file,
keeping a large amount of data and performance in mind, which approach is preferred:
Generate the file in the back end and download it.
Generate the file on the client side (Browser) using a library like XLSX to convert json array to excel.
Out of your two options it should be the backend side. But what I suggest is: if you expect the data to be huge, keep sending the records in chunks of hundreds or thousands and build the file on the backend as they arrive. Once it is done, you can download the entire file.
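As a rough sketch of the backend side, assuming Apache POI is available (the thread doesn't name a library), its streaming SXSSFWorkbook keeps only a small window of rows in memory while writing the .xlsx file:

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.List;

import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.xssf.streaming.SXSSFWorkbook;

public class ExcelExporter {

    /**
     * Writes records to an .xlsx file while keeping only a small
     * window of rows in memory, so large exports do not exhaust the heap.
     */
    public static void export(List<String[]> records, String filePath) throws IOException {
        // Keep at most 100 rows in memory; older rows are flushed to disk.
        try (SXSSFWorkbook workbook = new SXSSFWorkbook(100);
             FileOutputStream out = new FileOutputStream(filePath)) {

            Sheet sheet = workbook.createSheet("export");

            int rowIndex = 0;
            for (String[] record : records) {
                Row row = sheet.createRow(rowIndex++);
                for (int col = 0; col < record.length; col++) {
                    row.createCell(col).setCellValue(record[col]);
                }
            }

            workbook.write(out);
            workbook.dispose(); // delete the temporary files backing the flushed rows
        }
    }
}
```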
I'm studying networking. I sent an image file between two PCs over ICMP and captured the traffic with Wireshark. I'm trying to get the raw data from the pcap file and decode it in Java to recover the data I sent. I have spent a lot of time on this but I don't know how to do it. I would really appreciate your help. Sorry, English is not my mother tongue.
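One minimal way to start, assuming the capture is a classic little-endian pcap file (not pcapng) and the frames are plain Ethernet + IPv4 without options + ICMP; the 42-byte payload offset below follows from that assumed framing:

```java
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class PcapIcmpPayloadDump {

    public static void main(String[] args) throws IOException {
        try (DataInputStream in = new DataInputStream(new FileInputStream(args[0]))) {
            // Classic pcap global header is 24 bytes; skip it.
            in.skipBytes(24);

            while (true) {
                // Per-packet header: ts_sec, ts_usec, incl_len, orig_len (4 bytes each).
                byte[] packetHeader = new byte[16];
                try {
                    in.readFully(packetHeader);
                } catch (EOFException eof) {
                    break; // no more packets
                }
                int capturedLength = ByteBuffer.wrap(packetHeader, 8, 4)
                        .order(ByteOrder.LITTLE_ENDIAN).getInt();

                byte[] frame = new byte[capturedLength];
                in.readFully(frame);

                // Assumed framing: 14-byte Ethernet + 20-byte IPv4 (no options) + 8-byte ICMP header.
                int payloadOffset = 14 + 20 + 8;
                if (capturedLength > payloadOffset) {
                    System.out.write(frame, payloadOffset, capturedLength - payloadOffset);
                }
            }
            System.out.flush();
        }
    }
}
```

This just concatenates the ICMP payloads to stdout; reassembling them into the original image depends on how the sender split the file.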
I have a webservice sending a huge JSON text to an Android app. There are about 20,000 ID numbers. Unfortunately, but perhaps not surprisingly, it's timing out.
What options do I have? The easiest one that comes to mind is somehow compressing this data. Is there any way I can do this (PHP webservice, Java Android app) effectively?
Failing that, is there some technique to send JSON in parts? If so, how does that work? At what point is JSON considered too big to send in one part? Thank you
You can use GZIP in PHP, send the compressed data as a stream to the client, and then decode it with Java on Android.
You can use this for gzip in PHP: GZIP
and gzip in Android: GZIP
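For example, assuming the PHP side returns the raw bytes produced by gzencode() without setting a Content-Encoding header (if it does set one, the HTTP client may already decompress for you), the Android side can unwrap the response with java.util.zip.GZIPInputStream:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;

public class GzipJsonClient {

    /** Downloads a gzip-compressed JSON body and returns it as a plain string. */
    public static String fetchCompressedJson(String endpoint) throws IOException {
        HttpURLConnection connection = (HttpURLConnection) new URL(endpoint).openConnection();
        try (InputStream raw = connection.getInputStream();
             GZIPInputStream gunzipped = new GZIPInputStream(raw);
             BufferedReader reader = new BufferedReader(
                     new InputStreamReader(gunzipped, StandardCharsets.UTF_8))) {

            StringBuilder json = new StringBuilder();
            String line;
            while ((line = reader.readLine()) != null) {
                json.append(line);
            }
            return json.toString();
        } finally {
            connection.disconnect();
        }
    }
}
```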
You could compress data with ob_gzhandler(). Put this call in your script before any output:
ob_start('ob_gzhandler');
After that, output will be compressed with gzip.
Compression alone is not a complete solution, though. You should split the JSON and send it as sequential smaller pieces. Otherwise, what will you do when even the compressed data is too big?
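A sketch of the splitting idea on the Android side, assuming the PHP endpoint accepts hypothetical offset/limit query parameters and each page is a plain JSON array of IDs:

```java
import java.util.ArrayList;
import java.util.List;

import org.json.JSONArray;
import org.json.JSONException;

public class PagedIdDownloader {

    /**
     * Fetches the ID list page by page instead of in one huge response.
     * Assumes the endpoint returns an empty page once the list is exhausted.
     */
    public static List<String> fetchAllIds(String baseUrl, int pageSize) throws Exception {
        List<String> ids = new ArrayList<>();
        int offset = 0;
        while (true) {
            String url = baseUrl + "?offset=" + offset + "&limit=" + pageSize;
            // Reuses the gzip client from the sketch above.
            String pageJson = GzipJsonClient.fetchCompressedJson(url);
            List<String> page = parseIds(pageJson);
            if (page.isEmpty()) {
                break; // server has no more records
            }
            ids.addAll(page);
            offset += pageSize;
        }
        return ids;
    }

    /** Parses a JSON array of ID strings with the org.json classes bundled with Android. */
    private static List<String> parseIds(String json) throws JSONException {
        JSONArray array = new JSONArray(json);
        List<String> result = new ArrayList<>(array.length());
        for (int i = 0; i < array.length(); i++) {
            result.add(array.getString(i));
        }
        return result;
    }
}
```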
I'm using the Microsoft Project Oxford Speaker Recognition REST API. In order to create an enrollment I need to send the binary data of a .wav file. I already have the class that records and saves the .wav file; now I have to POST it. I just don't know how to "decode" the .wav file I have into the binary data I need to send...
I appreciate any help.
Here's the link to what I'm trying to do: Speaker Recognition Create Enrollment.
In the documentation of the API, it is mentioned that the body of the request is the raw binary data of the *.wav file that you recorded. This means you just need to send the file as-is, without any decoding.
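A rough sketch of that POST with HttpURLConnection; the endpoint URL, the Content-Type value, and the subscription-key header should be taken from the Create Enrollment documentation, so treat the values below as placeholders:

```java
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;

public class EnrollmentUploader {

    /**
     * POSTs a recorded .wav file as a raw binary request body.
     * The Content-Type and header names are placeholders; use whatever
     * the Create Enrollment documentation specifies.
     */
    public static int uploadWav(String enrollmentUrl, String subscriptionKey, String wavPath)
            throws IOException {
        byte[] wavBytes = Files.readAllBytes(Paths.get(wavPath));

        HttpURLConnection connection = (HttpURLConnection) new URL(enrollmentUrl).openConnection();
        try {
            connection.setRequestMethod("POST");
            connection.setDoOutput(true);
            connection.setRequestProperty("Content-Type", "application/octet-stream");
            connection.setRequestProperty("Ocp-Apim-Subscription-Key", subscriptionKey);

            try (OutputStream body = connection.getOutputStream()) {
                body.write(wavBytes); // the file bytes are the request body, unmodified
            }
            return connection.getResponseCode();
        } finally {
            connection.disconnect();
        }
    }
}
```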