Use android.graphics.Bitmap from Java without Android

I need to use android.graphics.Bitmap in my Java project without the Android platform. The problem is that I want to process a picture, which arrives at the server from an Android device, as a Bitmap.
So I need some methods that use native code. How can I get these native methods? I tried to find them at http://androidxref.com but if I have, for example,
static int Bitmap_width(JNIEnv* env, jobject, SkBitmap* bitmap) {return bitmap->width();}
clicking width() leads to a new search that lists lots of classes containing different width() methods.

You shouldn't do that. The Android libraries are not meant to be used in non-Android applications, especially the native code.
As long as you send a bitmap image, any non-Android library will be able to read it.
You can even send the image's raw pixels. It's like sending a list of bytes; any language will be able to read incoming bytes.

I managed to do this. Now the server gets only the width and height of the image and an array of pixels obtained via bitmap.getPixels(...) on the device, and builds the picture as described here
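A minimal sketch of that server-side reconstruction, assuming the pixels arrive as ARGB ints in row-major order (the layout Bitmap.getPixels(...) produces by default); the class name here is illustrative:

```java
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

public class PixelsToImage {
    // Rebuild an image from the width, height and ARGB pixel array
    // sent by the device (the layout Bitmap.getPixels(...) produces).
    static BufferedImage fromPixels(int width, int height, int[] pixels) {
        BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);
        // setRGB with a full-width scanline copies the ARGB ints straight in.
        image.setRGB(0, 0, width, height, pixels, 0, width);
        return image;
    }

    public static void main(String[] args) throws Exception {
        // A 2x2 test image: red, green, blue, white.
        int[] pixels = {0xFFFF0000, 0xFF00FF00, 0xFF0000FF, 0xFFFFFFFF};
        BufferedImage image = fromPixels(2, 2, pixels);
        ImageIO.write(image, "png", new File("out.png"));
    }
}
```

This avoids any Android dependency on the server: only `java.awt` and `javax.imageio` are used.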

Related

Play 'unsupported' codec in VideoView

I have a stream that contains audio and video. The video encoding is H.264 AVC and the audio encoding is G.711 µ-law (PCMU). (I have no control over the output format.)
When I try to display the video in a VideoView frame, the device says that it cannot play the video. I guess that's because Android doesn't support the aforementioned audio encoding.
Is there a way to somehow display the selected stream in the VideoView?
Can I just use a Java code snippet to 'compute' the codec? I've seen the class android.net.rtp.AudioCodec, which contains a field PCMU, and I can imagine that class would be useful.
Or do I have to insert a library or some native code (FFmpeg) into my application and use that? If yes, how?
How to fix it?
If you want to manipulate a stream before you display it, you are unfortunately getting into tricky territory on an Android device. If you have any possibility of doing the conversion on the server side, it will likely be much easier (and you should also usually have more 'horsepower' on the server side).
Assuming that you do need to convert on the device, a technique you can use is to stream the video from the server to your app, convert it as needed and then 'stream' it from a localhost server in your app to a VideoView in your app. Roughly the steps are:
Stream the encoded file from the server as usual, 'chunk by chunk'
On your Android device, read from the stream and convert each chunk as it is received
Using a localhost http server on your Android device, now 'serve' the converted chunks to the MediaPlayer (the media player should be set up to use a URL pointing at your localhost http server)
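The localhost-serving step above can be sketched in plain Java as below. The conversion itself is stubbed out (in a real app it would invoke ffmpeg or similar on each chunk), and the class and method names are illustrative, not from any particular library:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

// Serves already-converted chunks to a local MediaPlayer over HTTP.
public class LocalProxyServer {
    private final ServerSocket serverSocket;

    public LocalProxyServer() throws IOException {
        // Port 0 = let the OS pick a free port; point the MediaPlayer at
        // "http://127.0.0.1:" + getPort() + "/"
        serverSocket = new ServerSocket(0);
    }

    public int getPort() { return serverSocket.getLocalPort(); }

    // Placeholder for the per-chunk conversion (e.g. an ffmpeg call).
    static byte[] convertChunk(byte[] chunk) {
        return chunk; // pass-through in this sketch
    }

    // Handle one request: a real server would loop and parse Range
    // headers so the MediaPlayer can seek.
    public void serveOnce(byte[] incomingChunk) throws IOException {
        try (Socket client = serverSocket.accept();
             InputStream in = client.getInputStream();
             OutputStream out = client.getOutputStream()) {
            in.read(new byte[1024]); // consume the request line and headers
            byte[] body = convertChunk(incomingChunk);
            String headers = "HTTP/1.1 200 OK\r\n"
                    + "Content-Type: application/octet-stream\r\n"
                    + "Content-Length: " + body.length + "\r\n\r\n";
            out.write(headers.getBytes(StandardCharsets.US_ASCII));
            out.write(body);
        }
    }
}
```

The key point is only that the MediaPlayer never knows the data was converted in-process; it just sees an ordinary HTTP URL.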
An example of this approach, I believe, is LibMedia: http://libeasy.alwaysdata.net (note, this is not a free library, AFAIK).
For the actual conversion you can use ffmpeg as you suggest - there are a number of ffmpeg wrappers for Android that you can either use or look at to design your own. Some examples:
http://hiteshsondhi88.github.io/ffmpeg-android-java/
https://github.com/jhotovy/android-ffmpeg
Take a look at the note about libffmpeginvoke in the second link in particular if you are planning to write your own wrapper.
One thing to be aware of is that compression techniques, in particular video compression, often use an approach where one packet or frame is compressed relative to the frames before it (and sometimes even the frames after it). For example, the first frame might be a 'key frame', the next five frames might just contain data explaining how they differ from the key frame, the seventh frame might be another key frame, and so on. This means you generally want a 'chunk' that contains all the required key frames if you are converting the chunk from one format to another.
I don't think you will have this problem converting from PCM to AAC audio, but it is useful to be aware of the concept anyway.

UIImage, imageWithData in Android?

I'm fairly new in Android development so please forgive me if it's a stupid question.
Basically I have an iPhone app which compresses a UIImage into NSData using the UIImageJPEGRepresentation method; this is not a problem.
The question begins when I try to read this data on an Android device. I'm transferring the image data (NSData) from the iPhone to Android, and I want to display the image on Android. How should I do so?
Is there any class in Java that does something similar to UIImage's imageWithData, i.e. that can decode the data and turn it into a Bitmap?
Or should I try to encode and decode using a different but standard method? If so, what are the options?
Thank you so much for your help!
After further digging and researching, I found some interesting and useful classes if anyone is interested.
https://code.google.com/p/crossmobile/source/browse/src/xmioslayer/src/org/xmlvm/iphone/?r=68eaa3b2fa6ee05e8034bd517b45b490c50c7bb7
This is Google's CrossMobile code; with these classes you can use NSData and other class methods that were originally meant for iOS development.
However that's not how I solve the problem above.
There are two ways to pass an image from an iPhone (UIImage) to Android (Bitmap), or the other way around.
First, upload the image to your own server (by using AFNetworking in iOS; not sure about Android), get the URL of the image, then pass this URL to the other device as a string.
Or, convert the UIImage to NSData, then from NSData to an NSString (using Base64 encoding), then use NSJSONSerialization to parse whatever NSString, NSArray or NSDictionary into a JSON object. Send the JSON object and decode it on the other device.
Although the first method is recommended, I actually went with the second one since I wish to handle the data locally with or without internet.
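The Base64 leg of that second approach looks like this in plain Java (java.util.Base64 on the receiving side; the JSON wrapping and the iOS encoding step are omitted here, and the class name is illustrative):

```java
import java.util.Base64;

public class ImageTransport {
    // Encode raw image bytes (e.g. the JPEG data produced by
    // UIImageJPEGRepresentation) into a Base64 string for JSON transport.
    static String encode(byte[] imageBytes) {
        return Base64.getEncoder().encodeToString(imageBytes);
    }

    // Decode the Base64 string back into bytes on the receiving device.
    // On Android these bytes can then be handed to
    // BitmapFactory.decodeByteArray(bytes, 0, bytes.length) to get a Bitmap.
    static byte[] decode(String base64) {
        return Base64.getDecoder().decode(base64);
    }
}
```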
If anyone needs sample code, just ask, I'll post.
And that's pretty much it! Happy coding!

Create an EGLContext in C++ and bind it to a Java GLSurfaceView

I am trying to create an EGLContext in C++ and bind it to a GLSurfaceView.
After some research on Google and even here, I don't see anything close to my problem.
Is it possible to do this?
I already know that I can do a NativeActivity, but I also need to use Java libraries for loading images and audio, making HTTP requests, getting device information, etc.
Any help is welcome.

Blackberry radio app streaming audio

I am building a radio app in BB 5. I have a .pls URL where I find the URLs to play the stream. My issue is that I need to build a buffer to play this stream, because the file being downloaded is too big to play immediately, but I don't know how to build this buffer. Any idea?
I think that it must be something similar to that
Streaming media BB
But I want something more simple, only play and stop the radio streaming.
OK, I've solved this, using the streaming package from the code mentioned in the link above. I've added the CircularByteBuffer from the small link shown in the article. There is a class in the streaming package which needs some fixes (in some BufferOverflowException and the resize method from CircularByteBuffer). And now my project is working! Great!
ADD
The fixes were in the StreamingPlayer class, where there is a call to the method resize with an int parameter. In the CircularByteBuffer code the method is resize(), so it doesn't need an int parameter; it doubles the buffer capacity. So I changed that call to use resize() without the int parameter.
The other fix is about BufferOverflowException. In the code this object takes a String, but it gives an error. I deleted these Strings.
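A minimal sketch of that circular buffer with a no-argument resize() that doubles the capacity, as described above (names are illustrative; the real CircularByteBuffer class has a larger API):

```java
// Minimal circular byte buffer; resize() doubles the capacity,
// matching the fixed resize() call described above.
public class MiniCircularByteBuffer {
    private byte[] buf;
    private int readPos, writePos, size;

    public MiniCircularByteBuffer(int capacity) { buf = new byte[capacity]; }

    public int available() { return size; }
    public int capacity() { return buf.length; }

    public void write(byte[] data) {
        while (size + data.length > buf.length) resize();
        for (byte b : data) {
            buf[writePos] = b;
            writePos = (writePos + 1) % buf.length;
            size++;
        }
    }

    public int read(byte[] dest) {
        int n = Math.min(dest.length, size);
        for (int i = 0; i < n; i++) {
            dest[i] = buf[readPos];
            readPos = (readPos + 1) % buf.length;
            size--;
        }
        return n;
    }

    // Double the capacity, copying the unread bytes to the front.
    private void resize() {
        byte[] bigger = new byte[buf.length * 2];
        for (int i = 0; i < size; i++) bigger[i] = buf[(readPos + i) % buf.length];
        buf = bigger;
        readPos = 0;
        writePos = size;
    }
}
```

A real streaming player would also need synchronization, since the network thread writes while the playback thread reads.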

Can I create Bitmaps with the Android NDK in C++ and pass them to Java really fast?

I need to create Bitmaps in C++ with the NDK and pass them to Java. I don't mean copy, but actually pass them as a parameter or return value or anything else like that, because copying them would be far too slow. I also really need to create them in the NDK part and not in Java.
Does anyone know a way to do that?
As Peter Lawrey already pointed out, using a non-Java object is not possible; however, it may be possible to directly paint the raw data from a simple Java byte array (which can be accessed directly on the C++ side).
In the end you could even call Canvas.drawBitmap(..) using this byte array for painting your created image. Of course that requires storing the image on the C++ side directly in the required format inside the byte array.
Java:
byte[] pixelBuf = new byte[bufSize];
Then pass the byte[] reference to the native C++ implementation. On C++ side you can call
C++:
jbyte* pixelBufPtr = env->GetByteArrayElements(pixelBuf, NULL); // second arg is a jboolean* isCopy flag; NULL if you don't care
jsize pixelBufLen = env->GetArrayLength(pixelBuf); // in C++ the JNIEnv* is implicit in env->, so it is not passed again
// modify the data at pixelBufPtr with C++ functions
env->ReleaseByteArrayElements(pixelBuf, pixelBufPtr, 0);
You can try using a direct ByteBuffer. A direct ByteBuffer refers to a native area of memory. However, to use the image in Java it has to be in a Java object. You can assemble this object in Java or C, but I don't see how you can avoid copying the image unless your C library writes the image as the Java structure.
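The direct-ByteBuffer route can be sketched like this in plain Java (the class and method names are illustrative); on the native side the same memory is reachable via env->GetDirectBufferAddress(buf), so C++ can fill in the pixels without a copy:

```java
import java.nio.ByteBuffer;

public class DirectPixelBuffer {
    // Allocate pixel storage outside the Java heap. Native code gets a
    // raw pointer to this exact memory via env->GetDirectBufferAddress(buf),
    // so no copy is needed when C++ writes the pixel data.
    static ByteBuffer allocatePixels(int width, int height, int bytesPerPixel) {
        return ByteBuffer.allocateDirect(width * height * bytesPerPixel);
    }
}
```

Getting the pixels into a displayable Bitmap on Android still costs one copy (e.g. via Bitmap.copyPixelsFromBuffer), which is the limitation the answer above describes.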
I had the same issue and my solution was to allocate the huge buffer in C++ and then draw each requested "view" into a Java bitmap. I use the "new" NDK bitmap functions which showed up in Android 2.2; they allow direct access to the bitmap bits. It may not be ideal for your use, but it does avoid copying bitmaps and allow you to create bitmaps as large as free memory. I create a bitmap the size of the display window and then draw scaled views from the large bitmap into it with my own native code scaler.
