I am trying to record video using MediaRecorder and camera2, but the app crashes as soon as mediaRecorder.start() is reached. In onCreate, prepareCamera() is called first and then trigger(). I am a bit new to camera2. Can anyone help me figure out why this is happening?
public void prepareCamera() throws CameraAccessException {
manager = (CameraManager) getSystemService(CAMERA_SERVICE);
String[] cameras = manager.getCameraIdList();
if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
Log.v("mycontroller","permission not granted");
return;
}
Log.v("mycontroller","permission granted "+cameras[0]);
manager.openCamera(cameras[0], new CameraDevice.StateCallback(){
@Override
public void onOpened(CameraDevice camera) {
Log.v("mycontroller","camera opened");
mCamera2 = camera;
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
try {
mediaRecorder.setOutputFile(createFile().getAbsolutePath());
mediaRecorder.prepare();
Log.v("mycontroller","recorder prepared");
List<Surface> list = new ArrayList<>();
list.add(mediaRecorder.getSurface());
final CaptureRequest.Builder captureRequest = mCamera2.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
captureRequest.addTarget(mediaRecorder.getSurface());
mCaptureRequest = captureRequest.build();
mCamera2.createCaptureSession(list, new CameraCaptureSession.StateCallback(){
@Override
public void onConfigured(CameraCaptureSession session) {
mSession = session;
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {
mSession = session;
}
}, null);
} catch (Exception e) {
e.printStackTrace();
}
}
@Override
public void onDisconnected(CameraDevice camera) {}
@Override
public void onError(CameraDevice camera, int error) {}
}, null);
}
public void trigger() {
try {
mediaRecorder.start();
mSession.setRepeatingRequest(mCaptureRequest,
new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureStarted(CameraCaptureSession session, CaptureRequest request, long timestamp, long frameNumber) {
Log.v("mycontroller","camera started capturing");
super.onCaptureStarted(session, request, timestamp, frameNumber);
}
@Override
public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
Log.v("mycontroller","camera stopped capturing");
super.onCaptureCompleted(session, request, result);
}
}, null);
} catch (CameraAccessException e) {
Log.v("mycontroller",e.getMessage());
e.printStackTrace();
}
}
private void releaseMediaRecorder() throws CameraAccessException {
mSession.stopRepeating();
try {
mediaRecorder.stop();
mediaRecorder.reset();
mediaRecorder.release();
}
catch (Exception e){}
mediaRecorder= null;
mCamera2=null;
}
Take a look at Google's Camera2Video sample to see if you can find any critical differences between your code and the sample:
https://github.com/googlearchive/android-Camera2Video/tree/master/Application/src/main/java/com/example/android/camera2video
You may need to set more MediaRecorder settings, such as the video resolution; the CamcorderProfile class is often used to do this easily.
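For illustration, here is a minimal sketch of configuring MediaRecorder from a CamcorderProfile (exception handling omitted; the profile choice and the outputFile variable are placeholders, not taken from the question's code):
MediaRecorder recorder = new MediaRecorder();
recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
// Pull resolution, frame rate, bit rate and codec from a profile the device supports.
CamcorderProfile profile = CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH);
recorder.setOutputFormat(profile.fileFormat);
recorder.setVideoFrameRate(profile.videoFrameRate);
recorder.setVideoSize(profile.videoFrameWidth, profile.videoFrameHeight);
recorder.setVideoEncodingBitRate(profile.videoBitRate);
recorder.setVideoEncoder(profile.videoCodec);
recorder.setOutputFile(outputFile.getAbsolutePath()); // outputFile: a File you create
recorder.prepare(); // getSurface() is only valid after prepare() with a SURFACE source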
Related
I am currently trying to implement computer vision algorithms on my Android device. My idea is to have a camera preview running and to draw edges and other geometric figures on top of the preview (in a separate view). To control the camera, I use the camera2 API.
I start a capture session with two requests with the following targets:
Request number 1: target/surface -> an ImageReader for image processing
Request number 2: target/surface -> a SurfaceView for camera preview
I submit both requests with setRepeatingBurst. A problem occurs when I process the image in the ImageReader callback (onImageAvailable): when processing takes, say, 500 ms, the preview drops to 1 fps. I think this is because image.close() gets delayed and the second request is somehow tied to that image.
I thought that with two requests, two separate images would be sent, one to each surface, but apparently that's not the case. I could make a quick copy in the ImageReader callback, close the image, and then process the copy, but I don't want to copy.
Here is the code:
HandlerThread imageProcessingThread = new HandlerThread("imageProcessingThread");
imageProcessingThread.start();
Handler imageProcessingHandler = new Handler(imageProcessingThread.getLooper());
ImageReader imageReader = ImageReader.newInstance(width, height, format, 1);
imageReader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Image image = reader.acquireNextImage();
if(image == null) return;
// simulating processing
try {
Thread.sleep(500);
} catch (InterruptedException e) {
e.printStackTrace();
}
// processing end
image.close();
}
}, imageProcessingHandler);
try {
HandlerThread imageProcessingThread3 = new HandlerThread("imageProcessingThread3");
imageProcessingThread3.start();
Handler imageProcessingHandler3 = new Handler(imageProcessingThread3.getLooper());
camManager.openCamera(cam.second, new CameraDevice.StateCallback() {
@Override
public void onOpened(@NonNull CameraDevice cameraDevice) {
try {
List<Surface> surfaces = new ArrayList<>();
surfaces.add(binding.svPreview.getHolder().getSurface());
surfaces.add(imageReader.getSurface());
cameraDevice.createCaptureSession(surfaces, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession session) {
try {
CaptureRequest.Builder requestBuilder1 = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
requestBuilder1.addTarget(imageReader.getSurface());
requestBuilder1.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, new Range<>(30, 30));
CaptureRequest.Builder requestBuilder2 = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
requestBuilder2.addTarget(binding.svPreview.getHolder().getSurface());
requestBuilder2.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, new Range<>(30, 30));
HandlerThread imageProcessingThread2 = new HandlerThread("imageProcessingThread2");
imageProcessingThread2.start();
Handler imageProcessingHandler2 = new Handler(imageProcessingThread2.getLooper());
List<CaptureRequest> requests = new ArrayList<>();
requests.add(requestBuilder1.build());
requests.add(requestBuilder2.build());
session.setRepeatingBurst(requests, new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureProgressed(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull CaptureResult partialResult) {
}
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
}
}, imageProcessingHandler2);
} catch (Exception ex) {
fatalError(ex.getMessage());
return;
}
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
fatalError("onConfigureFailed");
return;
}
}, null);
} catch (Exception ex) {
fatalError(ex.getMessage());
return;
}
}
@Override
public void onDisconnected(@NonNull CameraDevice cameraDevice) {
}
@Override
public void onError(@NonNull CameraDevice cameraDevice, int i) {
}
}, imageProcessingHandler3);
} catch (Exception ex) {
fatalError(ex.getMessage());
return;
}
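For reference, a rough sketch of the copy-then-close approach mentioned in the question (even though the goal is to avoid copying). It assumes a YUV_420_888 ImageReader, copies only the Y plane, ignores row-stride padding for brevity, and processLuma is a hypothetical method:
imageReader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Image image = reader.acquireNextImage();
if (image == null) return;
ByteBuffer yPlane = image.getPlanes()[0].getBuffer();
byte[] luma = new byte[yPlane.remaining()];
yPlane.get(luma); // quick copy of the Y plane
int w = image.getWidth();
int h = image.getHeight();
image.close(); // return the buffer right away so the camera keeps its frame rate
processLuma(luma, w, h); // hypothetical slow processing on the copied data
}
}, imageProcessingHandler);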
I'm trying to configure camera capture without a preview, but after taking the second photo the app crashes with this exception:
2018-12-27 14:36:20.392 12389-12977/com.example.android.braillefeeder E/RequestThread-0: Received device exception during capture call:
java.io.IOException: setPreviewTexture failed
at android.hardware.Camera.setPreviewTexture(Native Method)
at android.hardware.camera2.legacy.RequestThreadManager.doJpegCapturePrepare(RequestThreadManager.java:298)
at android.hardware.camera2.legacy.RequestThreadManager.-wrap1(Unknown Source:0)
at android.hardware.camera2.legacy.RequestThreadManager$5.handleMessage(RequestThreadManager.java:830)
at android.os.Handler.dispatchMessage(Handler.java:101)
at android.os.Looper.loop(Looper.java:164)
at android.os.HandlerThread.run(HandlerThread.java:65)
I'm targeting SDK version 28. I thought the problem might have something to do with threading, so I tried initializing the thread on the main thread, but with no success. I have this code:
public class CameraService {
// Size of taken photo
private static final int IMAGE_WIDTH = 1280;
private static final int IMAGE_HEIGHT = 960;
private CameraDevice mCameraDevice;
private CameraCaptureSession mCameraCaptureSession;
private HandlerThread backgroundThread;
private Handler backgroundHandler;
private ImageReader mImageReader;
private CameraService() {
}
private static class InstanceHolder {
private static CameraService sCameraService = new CameraService();
}
public static CameraService getInstance() {
return InstanceHolder.sCameraService;
}
public void initializeCamera(Context context,
ImageReader.OnImageAvailableListener onImageAvailableListener) {
CameraManager cameraManager = (CameraManager) context.getSystemService(CAMERA_SERVICE);
String[] camIds = {};
try {
camIds = cameraManager.getCameraIdList();
} catch (CameraAccessException e) {
e.printStackTrace();
}
if( camIds.length < 1) {
Log.e("CameraService", "Camera not available.");
return;
}
startBackgroundThread();
mImageReader = ImageReader.newInstance(IMAGE_WIDTH, IMAGE_HEIGHT, ImageFormat.JPEG, 2);
mImageReader.setOnImageAvailableListener(onImageAvailableListener, backgroundHandler);
try {
cameraManager.openCamera(camIds[0], mStateCallback, backgroundHandler);
} catch (SecurityException e) {
e.printStackTrace();
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
private void startBackgroundThread() {
backgroundThread = new HandlerThread("CameraBackground");
backgroundThread.start();
backgroundHandler = new Handler(backgroundThread.getLooper());
}
private void stopBackgroundThread() {
backgroundThread.quitSafely();
try {
backgroundThread.join();
backgroundThread = null;
backgroundHandler = null;
} catch (InterruptedException e) {
e.printStackTrace();
}
}
private final CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(@NonNull CameraDevice cameraDevice) {
mCameraDevice = cameraDevice;
}
@Override
public void onDisconnected(@NonNull CameraDevice cameraDevice) {
cameraDevice.close();
mCameraDevice = null;
}
@Override
public void onError(@NonNull CameraDevice cameraDevice, int i) {
cameraDevice.close();
mCameraDevice = null;
}
@Override
public void onClosed(@NonNull CameraDevice camera) {
mCameraDevice = null;
}
};
public void takePicture() {
Log.d("CameraService", "takePicture()");
if( mCameraDevice == null) {
Log.d("CameraService", "Cannot take picture. Camera device is null.");
return;
}
try {
mCameraDevice.createCaptureSession(Collections.singletonList(mImageReader.getSurface()),
new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession cameraCaptureSession) {
if( mCameraDevice == null) {
Log.e("mStateCallback", " mStateCallbackCaptureSession configured");
return;
}
Log.d("CameraService", "imageCapture()");
mCameraCaptureSession = cameraCaptureSession;
imageCapture();
}
@Override
public void onConfigureFailed(CameraCaptureSession cameraCaptureSession) {
Log.e("mStateCallback", "Configure failed");
}
}, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
private void imageCapture() {
Log.d("CameraService", "imageCapture()");
try {
final CaptureRequest.Builder builder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
builder.addTarget(mImageReader.getSurface());
builder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON);
mCameraCaptureSession.stopRepeating();
mCameraCaptureSession.capture(builder.build(), mCaptureCallback, null);
} catch (CameraAccessException e) {
Log.e("imagecapture()", "KOKOTKO");
e.printStackTrace();
}
}
private final CameraCaptureSession.CaptureCallback mCaptureCallback =
new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureStarted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, long timestamp, long frameNumber) {
super.onCaptureStarted(session, request, timestamp, frameNumber);
}
@Override
public void onCaptureProgressed(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull CaptureResult partialResult) {
super.onCaptureProgressed(session, request, partialResult);
}
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
if( session != null) {
session.close();
}
}
};
public void shutdown() {
Log.d("CameraService", "shutdown()");
if( mCameraDevice != null) {
mCameraDevice.close();
}
if( mCameraCaptureSession != null) {
mCameraCaptureSession.close();
}
stopBackgroundThread();
}
}
Thanks
I had the same issue with the camera2 API when capturing an image for the second time. The problem is caused by not closing the Image (image.close()) in the ImageReader listener after you get the bytes.
I hope this helps.
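A minimal sketch of what such a listener could look like, assuming a JPEG ImageReader; the important part is that image.close() runs as soon as the bytes have been copied out:
ImageReader.OnImageAvailableListener listener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Image image = reader.acquireLatestImage();
if (image == null) return;
ByteBuffer buffer = image.getPlanes()[0].getBuffer(); // JPEG bytes are in plane 0
byte[] bytes = new byte[buffer.remaining()];
buffer.get(bytes);
image.close(); // without this, the reader's buffer pool fills up and later captures fail
// ... decode or save 'bytes' here
}
};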
I'm trying to create a barcode scanner with Google's camera2 and ML Kit APIs. I've finally managed to get the preview working, but I have no idea how to get the picture itself and pass it on to the ML Kit API.
I've tried using the ImageReader class, but somehow the onImageAvailable callback is never called.
Here is the code:
import ...
public class barcodeScannerActivity extends AppCompatActivity
implements OnRequestPermissionsResultCallback {
CameraManager mCameraManager;
SurfaceView mSurfaceViewPreview;
Surface mSurfacePreview;
CaptureRequest.Builder mPreviewRequestBuilder;
CaptureRequest mPreviewRequest;
List<Surface> mSurfaceList;
ImageReader mImageReader;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_barcode_scanner);
FirebaseVisionBarcodeDetectorOptions options =
new FirebaseVisionBarcodeDetectorOptions.Builder()
.setBarcodeFormats(
FirebaseVisionBarcode.FORMAT_EAN_13,
FirebaseVisionBarcode.FORMAT_EAN_8,
FirebaseVisionBarcode.FORMAT_UPC_A,
FirebaseVisionBarcode.FORMAT_UPC_E)
.build();
FirebaseVisionBarcodeDetector detector = FirebaseVision.getInstance().
getVisionBarcodeDetector(options);
initCamera();
}
private void initCamera() {
try {
Log.d("debug", "camera initiated...");
int permission = ContextCompat.checkSelfPermission(getApplicationContext(),
android.Manifest.permission.CAMERA);
int granted = PackageManager.PERMISSION_GRANTED;
if(permission == granted) {
mCameraManager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
String[] cameraIdList = mCameraManager.getCameraIdList();
if(cameraIdList.length == 0) {
throw new Exception("No camera found", null);
}
String backFacingCameraID = getBackFacingCameraID(cameraIdList);
if(backFacingCameraID != null) {
mSurfaceList = new ArrayList<>();
mSurfaceViewPreview = findViewById(R.id.barcodeScanner);
mSurfacePreview = mSurfaceViewPreview.getHolder().getSurface();
mSurfaceList.add(mSurfacePreview);
mCameraManager.openCamera(backFacingCameraID, cameraCallback, null);
} else {
//show error message that no backfacing camera is found.
}
} else {
ActivityCompat.requestPermissions(this,
new String[] {android.Manifest.permission.CAMERA}, 0);
}
} catch(Exception e) {
e.printStackTrace();
}
}
private CameraDevice.StateCallback cameraCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(@NonNull CameraDevice cameraDevice) {
try {
mPreviewRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
mPreviewRequestBuilder.addTarget(mSurfacePreview);
mPreviewRequest = mPreviewRequestBuilder.build();
cameraDevice.createCaptureSession(mSurfaceList, stateCallback, null);
} catch(Exception e) {
e.printStackTrace();
}
}
@Override
public void onDisconnected(@NonNull CameraDevice cameraDevice) {}
@Override
public void onError(@NonNull CameraDevice cameraDevice, int i) {}
};
private CameraCaptureSession.StateCallback stateCallback = new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
try {
cameraCaptureSession.setRepeatingRequest(mPreviewRequest, captureCallback ,null);
} catch(Exception e) {
e.printStackTrace();
}
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {}
};
private CameraCaptureSession.CaptureCallback captureCallback = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureStarted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, long timestamp, long frameNumber) {
super.onCaptureStarted(session, request, timestamp, frameNumber);
}
@Override
public void onCaptureProgressed(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull CaptureResult partialResult) {
super.onCaptureProgressed(session, request, partialResult);
}
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
Log.d("debug", String.valueOf(result.getPartialResults()));
}
@Override
public void onCaptureFailed(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull CaptureFailure failure) {
super.onCaptureFailed(session, request, failure);
}
};
private String getBackFacingCameraID(String[] cameraIdList) {
String backFacingCameraID = null;
try {
for(String cameraID:cameraIdList) {
CameraCharacteristics characteristics =
mCameraManager.getCameraCharacteristics(cameraID);
if(characteristics.get(CameraCharacteristics.LENS_FACING) == 1) {
backFacingCameraID = cameraID;
}
}
} catch (Exception e) {
e.printStackTrace();
}
return backFacingCameraID;
}
@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
super.onRequestPermissionsResult(requestCode, permissions, grantResults);
initCamera();
}
public void backButton(View view) {}
}
It seems that I forgot to add:
mPreviewRequestBuilder.addTarget(mImageReaderSurface);
Also, the camera froze after 5 frames. To fix that, see Camera2 ImageReader freezes repeating capture request.
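To spell out the first fix: the ImageReader's surface has to be registered both with the capture session and as a target of the repeating request. A rough sketch using the question's field names (the size, format and null handler here are assumptions):
mImageReader = ImageReader.newInstance(1280, 720, ImageFormat.YUV_420_888, 2);
mImageReader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Image image = reader.acquireLatestImage();
if (image == null) return;
// ... hand the frame to the ML Kit detector here ...
image.close();
}
}, null);
Surface mImageReaderSurface = mImageReader.getSurface();
mSurfaceList.add(mImageReaderSurface); // the session must be created with this surface
mPreviewRequestBuilder.addTarget(mImageReaderSurface); // and the request must target it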
I'm trying to develop an application based on Google's Camera2Basic example that uses the Camera2 API.
I would like to add a button that allows me to switch the flash mode.
The problem is that when I click the button to switch between flash modes, the captureBuilder doesn't apply the correct flash mode. It only works when I open the camera for the first time.
setFlash method:
private void setFlash(CaptureRequest.Builder requestBuilder) {
if (mFlashSupported) {
switch (mFlashMode) {
case FLASH_AUTO:
requestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);
requestBuilder.set(CaptureRequest.FLASH_MODE, CameraMetadata.FLASH_MODE_OFF);
break;
case FLASH_ON:
requestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CameraMetadata.CONTROL_AE_MODE_ON_ALWAYS_FLASH);
requestBuilder.set(CaptureRequest.FLASH_MODE, CameraMetadata.FLASH_MODE_SINGLE);
break;
case FLASH_OFF:
requestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CameraMetadata.CONTROL_AE_MODE_ON);
requestBuilder.set(CaptureRequest.FLASH_MODE, CameraMetadata.FLASH_MODE_OFF);
break;
}
}
}
createCameraPreviewSession method:
private void createCameraPreviewSession() {
try {
...
mCameraDevice.createCaptureSession(Arrays.asList(surface, mImageReader.getSurface()),
new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
if (null == mCameraDevice) {
return;
}
mCaptureSession = cameraCaptureSession;
try {
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,
CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
setFlash(mPreviewRequestBuilder);
mPreviewRequest = mPreviewRequestBuilder.build();
mCaptureSession.setRepeatingRequest(mPreviewRequest,
mCaptureCallback, mBackgroundHandler);
} catch (CameraAccessException e) {
...
}
}
@Override
public void onConfigureFailed(
@NonNull CameraCaptureSession cameraCaptureSession) {
...
}
}, null
);
} catch (CameraAccessException e) {
...
}
}
captureStillPicture method:
private void captureStillPicture() {
try {
final CaptureRequest.Builder captureBuilder =
mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(mImageReader.getSurface());
captureBuilder.set(CaptureRequest.CONTROL_AF_MODE,
CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
setFlash(captureBuilder);
CameraCaptureSession.CaptureCallback CaptureCallback
= new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session,
@NonNull CaptureRequest request,
@NonNull TotalCaptureResult result) {
unlockFocus();
}
};
mCaptureSession.stopRepeating();
mCaptureSession.capture(captureBuilder.build(), CaptureCallback, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
unlockFocus method:
private void unlockFocus() {
try {
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER,
CameraMetadata.CONTROL_AF_TRIGGER_CANCEL);
mCaptureSession.capture(mPreviewRequestBuilder.build(), mCaptureCallback,
mBackgroundHandler);
// After this, the camera will go back to the normal state of preview.
setFlash(mPreviewRequestBuilder);
mState = STATE_PREVIEW;
mCaptureSession.setRepeatingRequest(mPreviewRequest, mCaptureCallback,
mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
How could I fix this problem?
Thank you.
You need to make sure you call
mPreviewRequest = mPreviewRequestBuilder.build();
mCaptureSession.setRepeatingRequest(mPreviewRequest, mCaptureCallback,
mBackgroundHandler);
or equivalent after you update the request builder's flash mode. The request builder just creates requests - you still have to send them to the camera once you've set the new values you want.
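For example, a flash-mode button handler could look roughly like this (setFlashMode is a hypothetical method name; the fields match the question's code):
private void setFlashMode(int flashMode) {
mFlashMode = flashMode;
try {
setFlash(mPreviewRequestBuilder); // write the new AE/flash values into the builder
mPreviewRequest = mPreviewRequestBuilder.build(); // rebuild the request
mCaptureSession.setRepeatingRequest(mPreviewRequest, mCaptureCallback, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
The same point seems to apply to unlockFocus() in the question: after calling setFlash(mPreviewRequestBuilder) there, rebuild mPreviewRequest before passing it to setRepeatingRequest, otherwise the previously built request (with the old flash mode) keeps being reused.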
The basic idea is that if no face is detected, the app should return to Main (resetting the Activity), but after returning to Main, getCameraInstance(0) returns null.
When I go back to Main, onPause() is called and releases the camera; after Main is re-created, the back camera has already been released, so it should be possible to create a new instance of the back camera without a problem. Am I wrong?
Thanks
PS: I get java.lang.RuntimeException: Fail to connect to camera service, which means I didn't release the camera correctly, but I can't find my error.
Here is my code:
public class MainActivity extends Activity{
....
private Camera mCameraBack=null;
private CameraPreviewBack mPreviewBack;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
if(mCameraBack==null){
mCameraBack=getCameraInstance(0);
}
try{
mPreviewBack=new CameraPreviewBack(this,mCameraBack );
previewBack=(FrameLayout)findViewById(R.id.camera_preview_back);
previewBack.addView(mPreviewBack);
}catch(Exception ex){
Log.d(TAG,"surfaceCreate");
ex.printStackTrace();
}
public void capture(View view){
mPreviewBack.takePicture();
...
}
@Override
protected void onPause() {
Log.d(TAG, "onPause");
super.onPause();
if (mCameraBack != null) {
mCameraBack.stopPreview();
mCameraBack.release();
mCameraBack = null;
}
if (mPreviewBack != null) {
previewBack.removeView(mPreviewBack);
mPreviewBack = null;
}
}
@Override
public void onResume() {
Log.d(TAG, "onResume");
super.onResume();
if (mCameraBack == null) {
mCameraBack = getCameraInstance(0);
}
if (mPreviewBack == null) {
mPreviewBack=new CameraPreviewBack(this,mCameraBack );
previewBack.addView(mPreviewBack);
}
}
public static Camera getCameraInstance(int cameraId){
Camera c = null;
try {
c = Camera.open(cameraId);
}
catch (Exception e){
}
return c; // returns null if camera is unavailable
}
}
and
public class CameraPreviewBack extends SurfaceView implements SurfaceHolder.Callback {
...
public CameraPreviewBack(Context context,Camera camera) {
super(context);
mCamera=camera;
this.context=context;
mHolder=getHolder();
mHolder.addCallback(this);
}
@Override
public void surfaceCreated(SurfaceHolder holder) {
Log.d(TAG, "Surface created");
try {
if(mCamera!=null){
mCamera.setPreviewDisplay(holder);
mCamera.startPreview();
}
} catch (IOException e) {
Log.d(TAG, "mHolder failure");
e.printStackTrace();
}
}
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width,
int height) {
Log.d(TAG, "Surface changed");
configureCamRotation();
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
}
public void takePicture(){
..
task.execute();
try {
task.get();
} catch (InterruptedException e) {
} catch (ExecutionException e) {
}
task.cancel(isFinished);
}
private PictureCallback getPictureCallback() {
PictureCallback mPicture = new PictureCallback() {
@Override
public void onPictureTaken(byte[] data, Camera camera){
....
if(!detected){
Intent intentMain=new Intent(context, MainActivity.class);
intentMain.addFlags(Intent.FLAG_ACTIVITY_CLEAR_TOP);
context.startActivity(intentMain);
}
...
}
private class TakePictureTask extends AsyncTask<Void, Void, Void> {
@Override
protected Void doInBackground(Void... params) {
mCamera.takePicture(null, null, getPictureCallback());
try {
Thread.sleep(2000); // 2 second preview
} catch (InterruptedException e) {
}
return null;
}
}
Never swallow exceptions like you do here:
catch (Exception e){
}
Print the exception, for example Log.d(TAG, "Error when init camera", e); and you will likely see the RuntimeException documented here:
http://developer.android.com/reference/android/hardware/Camera.html#open(int)
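For example, a variant of the question's getCameraInstance that logs the failure instead of returning null silently:
public static Camera getCameraInstance(int cameraId) {
Camera c = null;
try {
c = Camera.open(cameraId);
} catch (Exception e) {
// Don't swallow this: the message tells you why open() failed.
Log.e(TAG, "Error when opening camera " + cameraId, e);
}
return c; // still returns null if the camera is unavailable
}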
Releasing the camera in onDestroy() and starting the Intent in the AsyncTask's onPostExecute() method solved my problem.
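A rough sketch of that fix, assuming the same fields as in the question (the rest of the Activity and task are omitted):
// In MainActivity: also release the camera when the Activity is destroyed.
@Override
protected void onDestroy() {
super.onDestroy();
if (mCameraBack != null) {
mCameraBack.stopPreview();
mCameraBack.release();
mCameraBack = null;
}
}
// In TakePictureTask: start the new Intent only after the background work finishes.
@Override
protected void onPostExecute(Void result) {
if (!detected) {
Intent intentMain = new Intent(context, MainActivity.class);
intentMain.addFlags(Intent.FLAG_ACTIVITY_CLEAR_TOP);
context.startActivity(intentMain);
}
}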