Screenshot: http://imgur.com/r7EpSpx
I have a very specific issue regarding the operation of a SurfaceView and a Camera. I am using the SurfaceView as the preview surface for a camera in an IntentService (to allow background operation).
After a lot of trial and error, I almost have the code working as desired. However, while MediaRecorder is recording video, whenever I resume my activity (and the SurfaceView is therefore re-created), the preview becomes extremely distorted: rainbow-colored artifacts in a tiled arrangement, with a large black rectangle in the middle. If you look closely, the camera is still working and the video records as normal; only the SurfaceView preview is broken.
Normally, I would re-instantiate the Camera object in the surfaceCreated method of my SurfaceView callback, but calling Camera.stopPreview() (or several other Camera methods) causes my MediaRecorder.stop() to hang indefinitely (this is a separate issue). Because of this, I must reuse the same Camera object in my IntentService when the surface is re-created.
Everything else works as normal, even the resulting video that the MediaRecorder produces; only the preview is affected. I am unable to determine whether this is a code issue, software issue, TouchWiz issue, or hardware issue. It occurs in all orientation configurations, and moving the code to surfaceChanged results in the same thing. Thanks in advance for any help or insight into this!
@Override
public void surfaceCreated(SurfaceHolder holder) {
    if (isRecording) { // Only run this code if MediaRecorder is recording
        try {
            recordingCamera.setPreviewDisplay(rHolder); // This works, but causes the aforementioned distortion
            //recordingCamera.startPreview(); // Removes distortion, but causes MediaRecorder.stop() to freeze the app indefinitely
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
    //recordingCamera.stopPreview(); // Removes distortion, but causes MediaRecorder.stop() to freeze the app indefinitely
}
Running this on a Galaxy Note II | Android 4.1.2 (TouchWiz)
Related
I have a little Android game I've been working on, and I implemented pixel-perfect collision checking based on code here (I say "based on": the only thing I changed was the class name). This used to work fine a few months ago, but I stopped development for a while, came back, and had to rebuild a lot of code, including that class. Now the collisions seem horribly off: if the large weight is to the left of the player, there is a significant distance between them and yet it still counts as a collision. I have also noticed that the player can pass partway through some of the smaller weights, but not all the way through. I know the player image has some non-transparent white pixels, which don't help, but I tried with a transparent, smaller image and the same thing happened. I have since edited the player image to remove the white pixels, but I'm still getting the odd collision detection. Any ideas?
The second issue: when the player collides with a weight, the game should go to the next activity (which it does), but it jerks all the objects on screen for about a second first, and only then proceeds to the next activity. Any ideas on this? I have a feeling it's to do with the way I handle the threads, or the way I close them. Check the GitHub link below for the GameThread class, which handles the main operations on my surface thread.
Collision code can be found in above link, my calling of the class can be found below:
// handles game over circumstances
if (CollisionUtil.isCollisionDetected(player.getBitmap(), player.getX(), player.getY(),
        weight.getBitmap(), weight.getX(), weight.getY())) {
    Intent gameOverIntent = new Intent(this.getContext(), GameOverActivity.class);
    player.setTouched(false);
    this.getContext().startActivity(gameOverIntent);
    ((Activity) this.getContext()).finish();
    save(score, time);
    try {
        gameTimers.join();
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
    gameOver = true;
}
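For reference, a pixel-perfect check of the kind isCollisionDetected presumably performs can be sketched like this. This is a hedged sketch: it uses boolean[][] opacity masks instead of android.graphics.Bitmap so it runs anywhere; the real code would build the mask from each pixel's alpha value.

```java
// Sketch of a pixel-perfect collision test. Each sprite is a boolean[][]
// opacity mask (true = opaque pixel); on Android the mask would be derived
// from Bitmap alpha values. Positions (ax, ay) / (bx, by) are top-left corners.
class CollisionSketch {
    static boolean isCollisionDetected(boolean[][] a, int ax, int ay,
                                       boolean[][] b, int bx, int by) {
        // Intersection of the two bounding boxes in screen coordinates.
        int left   = Math.max(ax, bx);
        int top    = Math.max(ay, by);
        int right  = Math.min(ax + a[0].length, bx + b[0].length);
        int bottom = Math.min(ay + a.length, by + b.length);
        if (left >= right || top >= bottom) return false; // boxes don't overlap

        // Within the overlap, collide only if BOTH sprites are opaque there.
        for (int y = top; y < bottom; y++) {
            for (int x = left; x < right; x++) {
                if (a[y - ay][x - ax] && b[y - by][x - bx]) return true;
            }
        }
        return false;
    }
}
```

Note that if the mask is built from the alpha channel, any opaque white padding in the PNG counts as solid, which matches the "collides at a distance" symptom; bounding-box positions that don't account for bitmap scaling would cause it too.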
The rest of the classes and all my other code can be found here: https://github.com/addrum/Francois/tree/master/src/com
Thanks for your help in advance.
Edit: Forgot to mention that if you need to test it (you probably will to see what I'm talking about) you can download it here: https://play.google.com/store/apps/details?id=com.main.francois
Edit: I've now uploaded a short recording on YouTube to make it easier to see what I'm talking about: http://youtu.be/vCjKmTmhabY
Thanks for the video; now I understand the "jerk" part.
The explanation for that is simple: your code is doing this (simplified):
while (running) {
    canvas = this.surfaceHolder.lockCanvas();
    this.gameLogic.update();
    this.gameLogic.render(canvas);
    surfaceHolder.unlockCanvasAndPost(canvas);
}

public void render(Canvas canvas) {
    if (!gameOver) {
        ...draw...
    }
}
If gameOver is true, your render() function doesn't draw anything, so you get the previous contents of the buffer.
You might expect this to show whatever was drawn from the frame just before this one. However, the SurfaceView surface is double- or triple-buffered. So if you just keep calling lock/post without drawing anything, you'll cycle through the last two or three frames.
Simple rule: if you lock the surface, you need to draw. If you don't want to draw when the game is over, you need to put your "game over" test back before you lock the canvas.
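Why posting without drawing shows stale frames can be modelled with a toy swap chain. This is only a sketch (the real buffer queue lives in the system compositor), but the visible effect is the same:

```java
import java.util.ArrayDeque;

// Toy model of a triple-buffered surface: lockCanvas() returns the oldest
// buffer in the swap chain; unlockCanvasAndPost() shows it and recycles it to
// the back of the queue. If the render step skips drawing, old frame contents
// rotate onto the screen, producing the momentary jerk described above.
class SwapChainModel {
    static class Buffer { String content = "empty"; }

    private final ArrayDeque<Buffer> chain = new ArrayDeque<>();
    String displayed = "";

    SwapChainModel() {
        for (int i = 0; i < 3; i++) chain.add(new Buffer());
    }

    // One pass of the game loop: lock, optionally draw, post.
    void frame(String drawOrNull) {
        Buffer b = chain.poll();
        if (drawOrNull != null) b.content = drawOrNull; // render() drew something
        displayed = b.content;                          // post to screen
        chain.add(b);                                   // recycle the buffer
    }
}
```

Running three drawn frames and then two "locked but not drawn" frames makes the content from several frames ago reappear, exactly as described.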
I need to be able to display two identical previews from a running camera instance on screen, is this possible?
So far the only way I could think of was to clone the preview SurfaceView: using the raw video data provided to onPreviewFrame, I create a new bitmap for the current frame and then draw it to the SurfaceHolder's canvas.
This displayed perfectly when run on the UI thread, but it obviously blocked the UI every couple of seconds while creating the bitmaps. I have now moved the bitmap creation into a new thread, which stops the app locking up, but now my SurfaceView flickers/tears slightly!
Would this ever work, or am I going about it the wrong way? How can I stop the flicker?
Cut down sample code;
public void onPreviewFrame(byte[] data, Camera camera) {
    // -- on a worker thread --
    // create bitmap from YUV data (videoFrame)

    // -- back on the UI thread --
    Canvas canvas = null;
    try {
        canvas = surfaceHolder.lockCanvas();
        if (canvas != null) {
            canvas.drawBitmap(videoFrame, null, new Rect(0, 0, 200, 200), null);
        }
    } catch (Exception e) {
        mLogger.error(e.getMessage());
    } finally {
        if (canvas != null) {
            surfaceHolder.unlockCanvasAndPost(canvas);
        }
    }
}
I believe this has something to do with the SurfaceView's double buffering; I have tried quite a few solutions but can't stop the flicker!
If you don't need to run on Android releases before 3.0 (Honeycomb), you can use the SurfaceTexture output option, and OpenGL ES for rendering multiple previews.
Setting up a working OpenGL view is a bit involved, so you'll want to find a guide for that.
Once you have that working, create a SurfaceTexture object with the OpenGL texture ID you want to use for the preview texture, and pass it to the camera with setPreviewTexture instead of setPreviewDisplay. Then, on your OpenGL thread, call updateTexImage to update the texture with the latest frame from the camera (ideally in response to the onFrameAvailable callback, for best performance).
Then you need OpenGL code to render two rectangles for your two copies of the preview, and a shader to draw the preview texture on both.
This is a decent bit of setup, but in the end, you'll have a very flexible system - then you can animate the preview in various fun ways, etc.
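As a hedged sketch of that wiring (the class name DualPreviewRenderer and the commented-out drawQuad call are placeholders of mine; the shader and quad setup from whatever OpenGL guide you follow still apply):

```java
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;

class DualPreviewRenderer implements SurfaceTexture.OnFrameAvailableListener {
    private int mTextureId;
    private SurfaceTexture mSurfaceTexture;
    private volatile boolean mFrameAvailable;

    void startPreview(Camera camera) throws java.io.IOException {
        // Create an external OES texture for the camera to render into.
        int[] tex = new int[1];
        GLES20.glGenTextures(1, tex, 0);
        mTextureId = tex[0];
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, mTextureId);

        mSurfaceTexture = new SurfaceTexture(mTextureId);
        mSurfaceTexture.setOnFrameAvailableListener(this);
        camera.setPreviewTexture(mSurfaceTexture); // instead of setPreviewDisplay
        camera.startPreview();
    }

    @Override
    public void onFrameAvailable(SurfaceTexture st) {
        mFrameAvailable = true; // request a render on the GL thread
    }

    void onDrawFrame(int width, int height) {
        if (mFrameAvailable) {
            mSurfaceTexture.updateTexImage(); // latch the newest camera frame
            mFrameAvailable = false;
        }
        // Draw the same texture twice, e.g. into the left and right halves:
        GLES20.glViewport(0, 0, width / 2, height);
        // drawQuad(mTextureId);  <- placeholder for your shader draw call
        GLES20.glViewport(width / 2, 0, width / 2, height);
        // drawQuad(mTextureId);
    }
}
```

The key API calls (the SurfaceTexture(textureId) constructor, Camera.setPreviewTexture(), and SurfaceTexture.updateTexImage()) are available from API level 11 onward; the fragment shader must sample via samplerExternalOES.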
The app I'm writing requires camera functionality.
So to learn about how to operate the camera, I followed this script:
http://developer.android.com/resources/samples/ApiDemos/src/com/example/android/apis/graphics/CameraPreview.html
I have put the activity in my manifest and set its screen orientation to landscape mode.
The problem I'm having is, when the camera is held sideways (so I hold my Galaxy Tab P1000 in landscape position) the view is stretched out.
To be more specific about my code: I used an exact copy of the code Google made. It can be found in android-sdk\samples\android-8\ApiDemos\src\com\example\android\apis\graphics\; the file itself is called CameraPreview.
I really have no clue why the screen looks so stretched. Of course the format is unusual rather than square, but still: the default camera app installed on the device doesn't deform at all, while my preview deforms the image as soon as I hold the device sideways and move the camera even a little.
To illustrate this, I held my Galaxy Tab up to take a picture of an object (a laptop in this case) with my app's camera screen open, and photographed the Galaxy with my phone, once holding it sideways and once in portrait. The pictures are a bit unclear, but you can see that in the landscape picture the preview has become massively wide.
I faced the same problem yesterday. After researching the Camera app's sources, I found the reason the camera preview gets stretched.
The reason is: the SurfaceView's aspect ratio (width/height) MUST be the same as the aspect ratio of the Camera.Size used in the preview parameters. If the aspect ratios differ, you get a stretched image.
So, the fastest workaround is to set the SurfaceView to a size like 320px x 240px (the smallest supported size from Parameters.getSupportedPreviewSizes()).
Also, you can look at the standard Camera application's sources; it uses a custom layout to control the SurfaceView size (see PreviewFrameLayout.java, the onMeasure() function).
Use
git clone https://android.googlesource.com/platform/packages/apps/Camera.git
to get Camera sources.
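As a hedged sketch of what "same aspect ratio" means in code, here is the size-matching logic as a pure function. Sizes are int[]{width, height} stand-ins for Camera.Size so the sketch runs without the Android SDK; on a device you would iterate the list from getSupportedPreviewSizes().

```java
import java.util.Arrays;
import java.util.List;

// Pick the supported preview size whose aspect ratio is closest to the
// view's, preferring the larger size on a tie. Sizes are int[]{width, height};
// on Android they would come from Camera.Parameters.getSupportedPreviewSizes().
class PreviewSizeChooser {
    static int[] chooseBestSize(List<int[]> supported, int viewWidth, int viewHeight) {
        double target = (double) viewWidth / viewHeight;
        int[] best = null;
        double bestDiff = Double.MAX_VALUE;
        for (int[] size : supported) {
            double diff = Math.abs((double) size[0] / size[1] - target);
            boolean closer = diff < bestDiff - 1e-9;
            boolean tieButBigger = Math.abs(diff - bestDiff) <= 1e-9
                    && best != null && size[0] * size[1] > best[0] * best[1];
            if (closer || tieButBigger) {
                best = size;
                bestDiff = diff;
            }
        }
        return best;
    }
}
```

Once a size is chosen, the SurfaceView must be measured to the same ratio (as PreviewFrameLayout.onMeasure() does), otherwise the preview is scaled non-uniformly and looks stretched.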
public void surfaceCreated(SurfaceHolder holder) {
    try {
        camera = Camera.open();
        camera.setDisplayOrientation(90);
        camera.setPreviewDisplay(holder);
        Camera.Parameters parameters = camera.getParameters();
        List<Size> sizes = parameters.getSupportedPictureSizes();
        parameters.setPictureSize(sizes.get(0).width, sizes.get(0).height); // default: picture size 0
        parameters.set("orientation", "portrait");
        //parameters.setPreviewSize(viewWidth, viewHeight);
        List<Size> size = parameters.getSupportedPreviewSizes();
        parameters.setPreviewSize(size.get(0).width, size.get(0).height);
        camera.setParameters(parameters);
        camera.startPreview();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
You only need getSupportedPreviewSizes(), saving the result to a List:
List<Size> size = parameters.getSupportedPreviewSizes();
parameters.setPreviewSize(size.get(0).width, size.get(0).height);
camera.setParameters(parameters);
camera.startPreview();
I hope this helps you.
How do I detect when my textures are destroyed on Android?
My Renderer class for my GLSurfaceView currently looks like this:
public void onDrawFrame(GL10 gl)
{
    nativeLibrary.drawFrame();
}

public void onSurfaceChanged(GL10 gl, int width, int height)
{
    if (reload)
    {
        library.glRecreate(); // this method reloads destroyed textures
    }
    else
    {
        nativeLibrary.init(width, height); // this method initializes my game
        reload = true;
    }
}

public void onSurfaceCreated(GL10 gl, EGLConfig config)
{
}
The problem is that this doesn't always work. When I press the home button from my game and then start it again, it works like a charm. But when I lock the device and then unlock it again, all textures are just black. Everything also seems to reset when I lock the device (my game always comes back to the main menu). When I quit the game using the home button and do a lock/unlock after that, the game doesn't reset.
When doing OpenGL on Android, I highly recommend that you watch these two Google I/O talks by Chris Pruett, Android advocate, who wrote the open-source game Replica Island.
Here he talks about the exact problem you're seeing. Long story short: you don't detect when your textures (and buffers) are destroyed; instead, you detect when they need to be recreated. And that is exactly what the onSurfaceCreated callback is for:
Since this method is called at the beginning of rendering, as well as every time the EGL context is lost, this method is a convenient place to put code to create resources that need to be created when the rendering starts, and that need to be recreated when the EGL context is lost. Textures are an example of a resource that you might want to create here.
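A minimal sketch of a Renderer arranged that way, assuming the asker's nativeLibrary methods (glRecreate reuploads all textures, init does one-time setup):

```java
// Hedged sketch: resource (re)creation lives in onSurfaceCreated, which runs
// both at startup and after every EGL context loss. nativeLibrary and its
// methods are placeholders for the asker's native code.
public class GameRenderer implements GLSurfaceView.Renderer {
    private boolean initialized = false;

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        // Called on first start AND whenever the EGL context was lost
        // (e.g. after the screen is locked). Old texture names are invalid
        // here, so reupload every texture.
        if (initialized) {
            nativeLibrary.glRecreate(); // reload all textures into the new context
        }
    }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
        if (!initialized) {
            nativeLibrary.init(width, height); // one-time game initialization
            initialized = true;
        }
        // handle viewport/size changes here
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        nativeLibrary.drawFrame();
    }
}
```

On API 11 and later you can also call GLSurfaceView.setPreserveEGLContextOnPause(true), which avoids the context loss in many cases (such as lock/unlock), though you still have to handle recreation on devices that drop the context anyway.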
For some reason the camera feed in my program is sideways. If I hold my finger up to the camera from the bottom, it shows up as coming from the right in the surfaceview.
I have a pretty standard implementation
surfaceView = (SurfaceView)findViewById(R.id.surfaceView);
surfaceHolder = surfaceView.getHolder();
surfaceHolder.addCallback(surfaceCallback);
surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
I've looked around but I can't seem to find any information on this. My device is a nexus One.
EDIT: If I set the screen orientation to landscape then it works fine for some reason... Can't get it in portrait though.
The orientation of the preview display image is controlled by Camera.setDisplayOrientation(int) method. For my Evo, setting the orientation to 90 makes the preview display correct for portrait mode.
...
camera.setParameters(parameters);
camera.setDisplayOrientation((orientation+90)%360);
camera.startPreview();
...
It seems that changing the orientation after the preview has started causes an exception. If you rotate the phone/device, the SurfaceHolder.surfaceChanged callback will be called again; if you do the setDisplayOrientation and startPreview in that callback, you get an exception. I'm going to try stopping the preview, then changing the orientation, then starting it again.
Note: There is also the Camera.Parameters.setRotation(int) method, that (I think) changes the orientation of the image when it is written into the (JPEG) output image when a picture is taken. This method does not change the preview display orientation.
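The SDK documentation for setDisplayOrientation gives a formula for computing the correct angle from the device rotation and the camera sensor orientation. Here it is as a pure function (a sketch: on a device, degrees would come from getWindowManager().getDefaultDisplay().getRotation() converted to degrees, and the other two values from Camera.CameraInfo):

```java
// Compute the argument for Camera.setDisplayOrientation(), following the
// formula in the Android SDK documentation.
// degrees: current screen rotation in degrees (0, 90, 180, or 270)
// cameraOrientation: CameraInfo.orientation (sensor mounting angle)
// frontFacing: CameraInfo.facing == CameraInfo.CAMERA_FACING_FRONT
class OrientationHelper {
    static int computeDisplayOrientation(int degrees, int cameraOrientation,
                                         boolean frontFacing) {
        int result;
        if (frontFacing) {
            result = (cameraOrientation + degrees) % 360;
            result = (360 - result) % 360; // compensate for the mirror effect
        } else {
            result = (cameraOrientation - degrees + 360) % 360;
        }
        return result;
    }
}
```

For a typical back camera with a sensor mounted at 90 degrees, this yields 90 in portrait and 0 in landscape, which matches the hard-coded setDisplayOrientation(90) fix above while also handling rotation.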
Use this line:
mCamera.setDisplayOrientation(90);
in the surfaceCreated method, after
mCamera = Camera.open();