I need to be able to display two identical previews from a running camera instance on screen, is this possible?
So far the only way I could think of was to clone the preview SurfaceView. Using the raw video data provided to onPreviewFrame,
I am creating a new bitmap for the current frame and then drawing that to the SurfaceHolder's canvas.
This displayed perfectly when run on the UI thread, but it obviously blocked the UI every couple of seconds while the bitmaps were being created. I have now moved the bitmap creation into a new thread, which stops the app locking up, but now my SurfaceView flickers/tears slightly!
Would this ever work, or am I going about it the wrong way? How can I stop the flicker?
Cut-down sample code:
public void onPreviewFrame(byte[] data, Camera camera) {
    // -- new thread --
    // create bitmap from YUV (videoFrame)

    // -- UI thread --
    Canvas canvas = null;
    try {
        canvas = surfaceHolder.lockCanvas();
        if (canvas != null) {
            canvas.drawBitmap(videoFrame, null, new Rect(0, 0, 200, 200), null);
        }
    } catch (Exception e) {
        mLogger.error(e.getMessage());
    } finally {
        if (canvas != null) {
            surfaceHolder.unlockCanvasAndPost(canvas);
        }
    }
}
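For reference, the "// create bitmap from YUV" step above can be done roughly as follows. This is a minimal sketch assuming the default NV21 preview format; previewWidth and previewHeight are assumed to hold the camera's preview size.
// Sketch: convert an NV21 preview frame to a Bitmap via an in-memory JPEG.
// The conversion is relatively expensive, which is why it should stay off the UI thread.
YuvImage yuv = new YuvImage(data, ImageFormat.NV21, previewWidth, previewHeight, null);
ByteArrayOutputStream jpegStream = new ByteArrayOutputStream();
yuv.compressToJpeg(new Rect(0, 0, previewWidth, previewHeight), 80, jpegStream);
byte[] jpegBytes = jpegStream.toByteArray();
Bitmap videoFrame = BitmapFactory.decodeByteArray(jpegBytes, 0, jpegBytes.length);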
I believe this is something to do with the SurfaceView's double buffering. I have tried quite a few things but can't stop the flicker!
If you don't need to run on Android releases before 3.0 (Honeycomb), you can use the SurfaceTexture output option, and OpenGL ES for rendering multiple previews.
Setting up a working OpenGL view is a bit involved, so you'll want to find a guide for that.
Once you have that working, create a SurfaceTexture object with the OpenGL texture ID you want to use for the preview texture, and pass it to the camera with setPreviewTexture instead of setPreviewDisplay. Then, on your OpenGL thread, call updateTexImage to update the texture with the latest frame from the camera (ideally in response to the onFrameAvailable callback, for best performance).
Then you need OpenGL code to render two rectangles for your two copies of the preview, and a shader to draw the preview texture on both.
This is a decent bit of setup, but in the end, you'll have a very flexible system - then you can animate the preview in various fun ways, etc.
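A rough sketch of the camera side of that setup follows. It assumes you already have a GL texture ID (glTextureId) created on your GL thread; mCamera and glSurfaceView are placeholder names.
// Sketch: route the camera preview into an OpenGL texture via SurfaceTexture (API 11+).
SurfaceTexture previewTexture = new SurfaceTexture(glTextureId);
previewTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
    @Override
    public void onFrameAvailable(SurfaceTexture st) {
        // Ask the GL thread to render; only call updateTexImage() on the thread
        // that owns the GL context.
        glSurfaceView.requestRender();
    }
});
try {
    mCamera.setPreviewTexture(previewTexture); // instead of setPreviewDisplay()
} catch (IOException e) {
    e.printStackTrace();
}
mCamera.startPreview();

// Later, on the GL thread (e.g. in onDrawFrame()):
previewTexture.updateTexImage(); // latch the latest camera frame into the texture
// ...then draw two textured quads, both sampling the same external OES texture.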
Related
In short - I have an OpenGL texture which I need to copy into a FBO or another texture in the most efficient way possible under OpenGL ES 2.0.
At the moment I'm simply drawing the first texture into an FBO.
Is there a more efficient way? For example, some sort of pixel copy that doesn't involve pulling the pixel data back to RAM and then uploading it to another texture, within the ES 2.0 profile.
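For context, the "drawing the first texture into an FBO" approach mentioned above looks roughly like this. This is only a sketch using android.opengl.GLES20; destTexture is assumed to be an already-allocated GL_TEXTURE_2D of the right size, and drawFullScreenQuad() is a hypothetical helper standing in for your own quad/shader code.
// Sketch: GPU-side copy by rendering srcTexture into an FBO whose colour
// attachment is destTexture. No pixel data ever returns to the CPU.
int[] fbo = new int[1];
GLES20.glGenFramebuffers(1, fbo, 0);
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo[0]);
GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
        GLES20.GL_TEXTURE_2D, destTexture, 0);

if (GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER) == GLES20.GL_FRAMEBUFFER_COMPLETE) {
    GLES20.glViewport(0, 0, textureWidth, textureHeight);
    drawFullScreenQuad(srcTexture); // hypothetical: draws a quad sampling srcTexture
}

GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0); // back to the default framebuffer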
To give you context - this is video rendering related:
I have an OpenGL texture being buffered with video frame data from LibVLC, but due to timing differences between my app's redraw rate and the decoder's refresh rate, it's possible that the OpenGL texture has no new frame to update. Oddly, if I draw the texture to the screen without a new frame, instead of containing the previous frame's data it draws some weird image of its own (it looks like a space invader?), which is why I need to 'cache' the previous frame's content.
Specifically, this is an OpenGL texture used to create an Android SurfaceTexture, on which I have to call updateTexImage and releaseTexImage manually: the first to update the texture's content, and the second to mark that I'm done with the SurfaceTexture's content once I've drawn it. If I don't release it, no new frames can be updated to it; if I release it, I can't redraw the same frame more than once. This has led me to cache the texture once it's updated, so the decoder is free to write to it and I still have the previous frame if I come to draw again and there hasn't been an update yet.
Thanks.
This solution is based on having worked out how to properly interface with the Android SurfaceTexture to avoid having to cache its contents altogether.
If anyone wants to share an answer regarding efficient texture copies in OpenGL ES 2.0 please feel free - that would befit my original question.
For anyone interested, this is what I did:
My GLPaint class for rendering video on Android contains the SurfaceTexture instance, which is created from a JOGL OpenGL Texture (using its ID). This enables me to draw the texture as part of my GLPaint.paint function.
This GLPaint implements the onFrameAvailable method of the SurfaceTexture.OnFrameAvailableListener interface. I have a private boolean to flag when the first frame has been 'written' - this is needed to avoid releasing the SurfaceTexture prematurely.
The GUIThreadLock class is my extension of a ReentrantLock; I extended it for additional debugging info and to make it a static singleton. This lock is my 'one in, one out' controller for threads in my UI, ensuring only a single thread is active in the UI at a time (usually just the main renderer looper). It also handles binding and unbinding the OpenGL context to the lock-owning thread whenever a thread acquires and relinquishes the lock, respectively.
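The GUIThreadLock class itself isn't shown here, but based on that description it has roughly this shape (a sketch only; the original extends ReentrantLock, whereas this simply wraps one, and the EGL binding details depend on how the display, surface and context are stored). The onFrameAvailable implementation that uses it follows.
// Sketch of the described lock: a static singleton around a ReentrantLock that
// also binds/unbinds the GL context to the thread that currently holds the lock.
public final class GUIThreadLock {
    private static final ReentrantLock LOCK = new ReentrantLock();

    public static void acquire() {
        LOCK.lock();
        // Bind the GL context to this thread here, e.g.
        // EGL14.eglMakeCurrent(display, drawSurface, readSurface, context);
    }

    public static void relinquish() {
        // Unbind the GL context from this thread here, e.g.
        // EGL14.eglMakeCurrent(display, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_SURFACE, EGL14.EGL_NO_CONTEXT);
        LOCK.unlock();
    }

    private GUIThreadLock() { }
}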
@Override
public void onFrameAvailable(SurfaceTexture pSurfaceTexture)
{
    try
    {
        GUIThreadLock.acquire();
        if (written)
        {
            pSurfaceTexture.releaseTexImage();
        }
        pSurfaceTexture.updateTexImage();
        written = true;
    }
    finally
    {
        GUIThreadLock.relinquish();
    }
}
This means that when a new frame is ready, the SurfaceTexture releases the old one and updates to the new one; it then holds the current frame for whenever I want to draw the video, until the next frame is ready. The GUIThreadLock prevents the SurfaceTexture from updating its content while I may be drawing it, and also acquires my GL context so that the callback thread can actually perform the update here.
My redraw thread also requires the GUIThreadLock before it can draw my app; thus the GL context is protected and the relationship between frame updates and frame draws is cemented.
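In other words, the drawing side follows the same pattern as the callback above (a sketch; paintVideoFrame() is a placeholder for the actual GLPaint.paint call):
// Sketch: the redraw thread takes the same lock before touching the GL context,
// so a frame update and a frame draw can never overlap.
try {
    GUIThreadLock.acquire();  // also makes the GL context current on this thread
    paintVideoFrame();        // placeholder: draw the SurfaceTexture-backed texture
} finally {
    GUIThreadLock.relinquish();
}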
I've got a simple Swing/AWT application that runs in full screen mode on Windows. I have a couple of different PNG files that it loads as its own background image depending on context.
It loads them like this:
BufferedImage bufferedImage;
bufferedImage = ImageIO.read(getClass().getResource("/bg1.png"));
Image bgImage1 = bufferedImage.getScaledInstance(width, height, Image.SCALE_SMOOTH);
bufferedImage = ImageIO.read(getClass().getResource("/bg2.png"));
Image bgImage2 = bufferedImage.getScaledInstance(width, height, Image.SCALE_SMOOTH);
bufferedImage = ImageIO.read(getClass().getResource("/bg3.png"));
Image bgImage3 = bufferedImage.getScaledInstance(width, height, Image.SCALE_SMOOTH);
And later draws them like this:
window.repaint();
graphics.drawImage(bgImage1, 0, 0, null);
// draw some other stuff too, like text
And just to be thorough, window is a JWindow variable, and graphics is a Graphics2D variable.
The problem I'm running into happens when I switch out one of the background images for another. The first time I do the switch, calling something like this:
window.repaint();
graphics.drawImage(bgImage2, 0, 0, null);
// draw some other stuff too, like text
...the entire screen goes white for about a second. And then it does successfully show the image, but that flicker is really annoying. My guess is that because the images are relatively large and high resolution (2560x1440), it needs about a second to load them and scale them to the appropriate size.
How can I get it to load those images silently? As in... how do I avoid drawing a blank white screen for a second the first time it displays a new background image? All subsequent times are already instantaneous, probably because it has truly loaded them into memory by that point. But simply calling getScaledInstance apparently isn't enough to put things into memory, because it doesn't actually flicker until I call drawImage down the line.
ImageIcon will load in the background as a feature.
You can also accomplish this fairly easily with a background thread e.g.:
final String path = "/example.png";
new SwingWorker<BufferedImage, Void>() {
    @Override
    public BufferedImage doInBackground() throws IOException {
        return ImageIO.read(ClassName.class.getResource(path));
    }

    @Override
    public void done() {
        try {
            BufferedImage img = get();
            // put img somewhere
        } catch (InterruptedException ignored) {
        } catch (ExecutionException ex) {
            ex.printStackTrace(System.err);
        }
    }
}.execute();
Also,
window.repaint();
graphics.drawImage(bgImage2, 0, 0, null);
This worries me a little bit. In general we do not need to ask for a repaint of an entire top-level container. You should also not be using getGraphics() to paint.
Swing components (other than top-level containers) are double-buffered, but if you paint outside of the normal paint mechanism you do not get this benefit, and it will result in flickering.
Two good sources for correct custom painting are:
Lesson: Performing Custom Painting
Painting in AWT and Swing
Painting should be done passively by overriding paintComponent on a JComponent. A JLabel can also display an ImageIcon as a feature so you do not necessarily have to do custom painting if you just want to display an image.
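For example, a minimal sketch of passive painting for the background image described in the question (class and field names here are illustrative):
// Sketch: let Swing drive the painting instead of drawing via getGraphics().
class BackgroundPanel extends JPanel {
    private Image background; // swap this field, then call repaint()

    void setBackgroundImage(Image img) {
        background = img;
        repaint();
    }

    @Override
    protected void paintComponent(Graphics g) {
        super.paintComponent(g);
        if (background != null) {
            g.drawImage(background, 0, 0, getWidth(), getHeight(), this);
        }
        // draw some other stuff too, like text
    }
}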
In my experience it is the scaling that takes a lot of time, not the loading of the images.
Because of this, I usually store the scaled image so I can reuse it later and only have to scale once.
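A sketch of that idea: scale each background once, eagerly, into a BufferedImage and draw the cached copy afterwards (getScaledInstance defers the work, which is why the cost otherwise shows up at the first drawImage call).
// Sketch: force the scaling to happen once up front and keep the result.
static BufferedImage scaleOnce(BufferedImage source, int width, int height) {
    BufferedImage scaled = new BufferedImage(width, height, BufferedImage.TYPE_INT_ARGB);
    Graphics2D g2 = scaled.createGraphics();
    g2.setRenderingHint(RenderingHints.KEY_INTERPOLATION,
            RenderingHints.VALUE_INTERPOLATION_BILINEAR);
    g2.drawImage(source, 0, 0, width, height, null);
    g2.dispose();
    return scaled;
}

// For example, done once at startup (off the EDT if the files are large):
// Image bgImage1 = scaleOnce(ImageIO.read(getClass().getResource("/bg1.png")), width, height);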
Screenshot: http://imgur.com/r7EpSpx
I have a very specific issue regarding the operation of a SurfaceView and a Camera. I am using the SurfaceView as the preview surface for a camera in an IntentService (to allow background operation).
After a lot of trial and error, I almost have the code working as desired. However, when I am using MediaRecorder to record video, whenever I resume my activity (and thus the SurfaceView is re-created) the preview is extremely distorted/artifacted in what seems to be rainbow colors, with a large black rectangle in the middle, and the distortion appears in a "tile" arrangement. If you look closely, the camera is still working and the video records as normal, but the SurfaceView preview is broken.
Normally, I would re-instantiate the Camera object in the surfaceCreated method of my SurfaceView callback, but whenever I use Camera.stopPreview(), or several other Camera functions, it causes my MediaRecorder.stop() to hang indefinitely (a separate issue). Because of this, I must reuse the same Camera object in my IntentService when the surface is re-created.
Everything works as normal except this strange preview distortion; even the resulting video that the MediaRecorder produces is fine. Only the preview is affected. I am unable to determine whether this is a code issue, software issue, TouchWiz issue, or hardware issue. It occurs in all orientation configurations, and moving the code to surfaceChanged results in the same thing. Thanks in advance for any help or insight into this!
@Override
public void surfaceCreated(SurfaceHolder holder) {
    if (isRecording) { // Only run this code if MediaRecorder is recording
        try {
            recordingCamera.setPreviewDisplay(rHolder); // This works, but causes the aforementioned distortion
            //recordingCamera.startPreview(); // Removes distortion, but causes MediaRecorder.stop() to freeze app indefinitely
        } catch (Exception e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
}

@Override
public void surfaceDestroyed(SurfaceHolder holder) {
    //recordingCamera.stopPreview(); // Removes distortion, but causes MediaRecorder.stop() to freeze app indefinitely
}
Running this on a Galaxy Note II | Android 4.1.2 (TouchWiz)
The app I'm writing requires camera functionality.
So to learn how to operate the camera, I followed this sample:
http://developer.android.com/resources/samples/ApiDemos/src/com/example/android/apis/graphics/CameraPreview.html
I have put the activity in my manifest, set the screen orientation for it on landscape mode.
The problem I'm having is, when the camera is held sideways (so I hold my Galaxy Tab P1000 in landscape position) the view is stretched out.
To be more specific about my script, I used an exact copy of code that Google made. It can be found in the android-sdk\samples\android-8\ApiDemos\src\com\example\android\apis\graphics\
The file itself is called CameraPreview.
I really have no clue why the screen looks so stretched. Of course, the aspect ratio is unusual and not square, but still, the default camera app installed on the device doesn't deform at all, while my camera preview deforms the image as soon as I hold it sideways and move the camera even a little.
What I did was: I held my Galaxy Tab up to take a picture of an object (a laptop in this case), then took a picture of the Galaxy with my phone. On the Galaxy, the camera screen of the app I'm making is open; this applies to both images. One was held sideways and one in portrait. The pics are a bit unclear, but you can see that in the landscape picture the preview has become massively wide.
I faced the same problem yesterday. After researching the Camera app sources, I found the reason the camera preview gets stretched.
The reason is: the SurfaceView's aspect ratio (width/height) MUST be the same as the aspect ratio of the Camera.Size used in the preview parameters. If the aspect ratios differ, you get a stretched image.
So, the fastest workaround is to set the SurfaceView to a size like 320px x 240px, matching the smallest supported size from Parameters.getSupportedPreviewSizes().
Also, you can look at the standard Camera application's sources; it uses a custom layout to control the SurfaceView size (see PreviewFrameLayout.java, the onMeasure() function).
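The other direction also works: instead of resizing the SurfaceView, pick the supported preview size whose aspect ratio is closest to the view's. A sketch (viewWidth and viewHeight are assumed to be the SurfaceView's measured dimensions):
// Sketch: choose the supported preview size that best matches the view's aspect ratio.
Camera.Parameters params = camera.getParameters();
double targetRatio = (double) viewWidth / viewHeight;
Camera.Size best = null;
double bestDiff = Double.MAX_VALUE;
for (Camera.Size s : params.getSupportedPreviewSizes()) {
    double diff = Math.abs((double) s.width / s.height - targetRatio);
    if (diff < bestDiff) {
        bestDiff = diff;
        best = s;
    }
}
params.setPreviewSize(best.width, best.height);
camera.setParameters(params);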
Use
git clone https://android.googlesource.com/platform/packages/apps/Camera.git
to get Camera sources.
public void surfaceCreated(SurfaceHolder holder) {
    // TODO Auto-generated method stub
    try {
        camera = Camera.open();
        camera.setDisplayOrientation(90);
        camera.setPreviewDisplay(holder);
        Camera.Parameters parameters = camera.getParameters();
        List<Size> sizes = parameters.getSupportedPictureSizes();
        parameters.setPictureSize(sizes.get(0).width, sizes.get(0).height); // default: use picture size 0
        parameters.set("orientation", "portrait");
        //parameters.setPreviewSize(viewWidth, viewHeight);
        List<Size> size = parameters.getSupportedPreviewSizes();
        parameters.setPreviewSize(size.get(0).width, size.get(0).height);
        camera.setParameters(parameters);
        camera.startPreview();
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
}
You only need getSupportedPreviewSizes(); save the result to a List:
List<Size> size = parameters.getSupportedPreviewSizes();
parameters.setPreviewSize(size.get(0).width, size.get(0).height);
camera.setParameters(parameters);
camera.startPreview();
I hope this helps you.
I'm implementing a fairly standard app with the Android sdk that involves drawing using the SurfaceView, SurfaceHolder, Callback setup.
In my main thread (UI thread) I have no drawing or handling of the SurfaceHolder (or the canvas you retrieve with it).
In a separate thread I have the following:
Log.i("GAME.DrawThread", "run()");
Log.i("GAME.DrawThread", Thread.currentThread().getName());
Canvas canvas = null;
try {
canvas = holder.lockCanvas();
synchronized(holder) {
Log.i("GAME", "draw():synchronized");
Paint paint = new Paint();
paint.setColor(R.color.draw_color);
canvas.drawColor(R.color.draw_color);
canvas.drawLine(0, 0, 500, 500, paint);
}
} catch (SurfaceHolder.BadSurfaceTypeException e) {
Log.e("GAME", "onDraw(): BadSurfaceTypeException");
} finally {
if (canvas != null) {
holder.unlockCanvasAndPost(canvas);
}
}
This code is being executed, throws no exceptions, and has no negative side effects that I can find; however, the unlockCanvasAndPost() call never causes onDraw() to be called.
In other words, unlockCanvasAndPost() does not cause a redraw of the SurfaceView.
Any ideas what could cause this symptom? I have plenty of java experience, a fair amount of android experience, and a lot of debugging experience and cannot track this one down.
Thanks in advance.
So it turns out that when using SurfaceView you draw to a Surface that is underneath a Window. I was setting the background color of the View in xml; it turns out that sets the background color of the Window, not the Surface. In effect, I made the Window opaque so that you couldn't see the Surface underneath.
Lesson Learned.
That's not how SurfaceView works. Calling unlockCanvasAndPost() does not invoke onDraw(); that's the whole point of using a SurfaceView. A SurfaceView's surface lives in a different window.
This is old, but I have a feeling the problem is that there is no SurfaceHolder callback. He was trying to draw on the surface before the surface was created.
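That is, the drawing thread should not start until surfaceCreated has fired. A sketch (surfaceView and DrawThread are placeholders for the view and the thread running the code above):
// Sketch: only start drawing once the surface actually exists.
surfaceView.getHolder().addCallback(new SurfaceHolder.Callback() {
    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        drawThread = new DrawThread(holder); // placeholder for the drawing loop above
        drawThread.start();
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        drawThread.requestStop(); // hypothetical: stop drawing before the surface goes away
    }
});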
To bring the SurfaceView to the top of the Z order, add the code below to your custom SurfaceView:
init {
setZOrderOnTop(true)
}