I am programming an Android app, and on two devices it crashes with the following problem:
Update: The original issue was related to the MediaRecorder. Now I get a slightly different problem:
FATAL EXCEPTION: AsyncTask #3
Process: online.simpledesign.bikelog, PID: 14660
java.lang.RuntimeException: An error occurred while executing doInBackground()
at android.os.AsyncTask$3.done(AsyncTask.java:353)
at java.util.concurrent.FutureTask.finishCompletion(FutureTask.java:383)
at java.util.concurrent.FutureTask.setException(FutureTask.java:252)
at java.util.concurrent.FutureTask.run(FutureTask.java:271)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:245)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1162)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:636)
at java.lang.Thread.run(Thread.java:764)
Caused by: java.lang.RuntimeException: Can't create handler inside thread that has not called Looper.prepare()
at android.os.Handler.<init>(Handler.java:203)
at android.os.Handler.<init>(Handler.java:117)
at android.app.Dialog.<init>(Dialog.java:123)
at android.app.Dialog.<init>(Dialog.java:168)
at android.support.v7.app.AppCompatDialog.<init>(AppCompatDialog.java:46)
at android.support.v7.app.AlertDialog.<init>(AlertDialog.java:97)
at android.support.v7.app.AlertDialog$Builder.create(AlertDialog.java:980)
at online.simpledesign.bikelog.Record$MediaPrepareTask.doInBackground(Record.java:539)
at online.simpledesign.bikelog.Record$MediaPrepareTask.doInBackground(Record.java:524)
at android.os.AsyncTask$2.call(AsyncTask.java:333)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 4 more
The code that is executed when the error occurs is:
class MediaPrepareTask extends AsyncTask<Void, Void, Boolean> {
@Override
protected Boolean doInBackground(Void... voids) {
// initialize video camera
if (prepareVideoRecorder()) {
// Camera is available and unlocked, MediaRecorder is prepared,
// now you can start recording
mMediaRecorder.start();
} else {
// prepare didn't work, release the camera
releaseMediaRecorder();
// Feedback: the camera could not be started
AlertDialog alertDialog = new AlertDialog.Builder(Record.this).create();
alertDialog.setTitle("Achtung");
alertDialog.setMessage("Kamera konnte nicht gestartet werden. Du kannst trotzdem die Daten aufzeichnen und mir so helfen.");
alertDialog.setButton(AlertDialog.BUTTON_POSITIVE, "Okay",
new DialogInterface.OnClickListener() {
public void onClick(DialogInterface dialog, int which) {
dialog.dismiss();
}
});
alertDialog.show();
return false;
}
return true;
}
@Override
protected void onPostExecute(Boolean result) {
if (!result) {
Record.this.finish();
}
}
}
I tried every solution here, but none of them worked. What confuses me even more is that on some devices it works without a problem.
An Xperia XZ1 Compact with Android 8.0 and a Google Pixel with Android 7.1 both show this problem.
Any advice is appreciated.
Edit: Here is what happens in prepareVideoRecorder:
private boolean prepareVideoRecorder() {
// BEGIN_INCLUDE (configure_preview)
mCamera = CameraHelper.getDefaultCameraInstance();
if(CamcorderProfile.hasProfile(0,CamcorderProfile.QUALITY_QVGA)){
// We need to make sure that our preview and recording video size are supported by the
// camera. Query camera to find all the sizes and choose the optimal size given the
// dimensions of our preview surface.
Camera.Parameters parameters = mCamera.getParameters();
List<Camera.Size> mSupportedPreviewSizes = parameters.getSupportedPreviewSizes();
List<Camera.Size> mSupportedVideoSizes = parameters.getSupportedVideoSizes();
Camera.Size optimalSize = CameraHelper.getOptimalVideoSize(mSupportedVideoSizes, mSupportedPreviewSizes, mPreview.getWidth(), mPreview.getHeight());
// Use the same size for recording profile.
CamcorderProfile profile = CamcorderProfile.get(CamcorderProfile.QUALITY_QVGA);
profile.videoFrameWidth = optimalSize.width;
profile.videoFrameHeight = optimalSize.height;
// likewise for the camera object itself.
parameters.setPreviewSize(profile.videoFrameWidth, profile.videoFrameHeight);
mCamera.setDisplayOrientation(90);
mCamera.setParameters(parameters);
try {
// Requires API level 11+. For backward compatibility use {@link setPreviewDisplay}
// with {@link SurfaceView}
mCamera.setPreviewTexture(mPreview.getSurfaceTexture());
} catch (IOException e) {
Log.e(TAG, "Surface texture is unavailable or unsuitable" + e.getMessage());
return false;
}
// END_INCLUDE (configure_preview)
// BEGIN_INCLUDE (configure_media_recorder)
mMediaRecorder = new MediaRecorder();
// Step 1: Unlock and set camera to MediaRecorder
mCamera.unlock();
mMediaRecorder.setOrientationHint(90);
mMediaRecorder.setCamera(mCamera);
// Step 2: Set sources
mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
// Step 3: Set a CamcorderProfile (requires API Level 8 or higher)
mMediaRecorder.setProfile(profile);
mMediaRecorder.setVideoEncodingBitRate(1000000);
// Step 4: Set output file
mOutputFile = CameraHelper.getOutputMediaFile(getFilesDir().getAbsolutePath());
if (mOutputFile == null) {
return false;
}
mMediaRecorder.setOutputFile(mOutputFile.getPath());
// END_INCLUDE (configure_media_recorder)
// Step 5: Prepare configured MediaRecorder
try {
mMediaRecorder.prepare();
} catch (IllegalStateException e) {
Log.d(TAG, "IllegalStateException preparing MediaRecorder: " + e.getMessage());
releaseMediaRecorder();
return false;
} catch (IOException e) {
Log.d(TAG, "IOException preparing MediaRecorder: " + e.getMessage());
releaseMediaRecorder();
return false;
}
return true;
}
else {
return false;
}
}
In your prepareVideoRecorder() method, you should first check whether the profile you want to use is supported:
if (CamcorderProfile.hasProfile(cameraId, quality)) {
// then get the profile ...
}
The signature is:
public static boolean hasProfile(int cameraId, int quality)
If the device does not support that profile, you will always get the crash. So check first, and only perform the rest of the operations if the profile is supported.
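A minimal sketch of that check inside prepareVideoRecorder() (assuming the back camera, id 0, and a fallback to QUALITY_LOW, which the platform documents as available on devices with a camera) could look like this:
int cameraId = 0; // back-facing camera, assumed here
CamcorderProfile profile;
if (CamcorderProfile.hasProfile(cameraId, CamcorderProfile.QUALITY_QVGA)) {
    profile = CamcorderProfile.get(cameraId, CamcorderProfile.QUALITY_QVGA);
} else if (CamcorderProfile.hasProfile(cameraId, CamcorderProfile.QUALITY_LOW)) {
    // fall back to the lowest available quality instead of crashing
    profile = CamcorderProfile.get(cameraId, CamcorderProfile.QUALITY_LOW);
} else {
    return false; // no usable profile on this device, abort before touching MediaRecorder
}
mMediaRecorder.setProfile(profile);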
The code is supposed to work, but when I click my video button to stop recording I get a fatal exception, and the stack trace points at the line mMediaRecorder.stop();.
Here's my code:
private CameraDevice.StateCallback mCameraDeviceStateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(@NonNull CameraDevice camera) { // the camera device is handed back to us here
mCameraDevice = camera;
// Toast.makeText(getApplicationContext(), "Camera connection made!", Toast.LENGTH_SHORT).show();
// ^indicates camera is connected
if (mIsRecording){
try{
createVideoFileName();
}
catch(IOException e) {
e.printStackTrace();
}
startRecord();
mMediaRecorder.start();
}
else {
startPreview();
}
}
// From Marshmallow onwards the app has to request the required permissions at runtime,
// so this method keeps the code compatible with both older and newer versions.
private void checkWriteStoragePermission(){
// check the OS version so devices older than Marshmallow are also supported
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) { // Marshmallow or later
// v check to see if granted permission
if (ContextCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE) ==
PackageManager.PERMISSION_GRANTED) {
mIsRecording = true;
mRecordImageButton.setImageResource(R.drawable.btn_video_busy); // switch the button to the "recording" icon
try {
createVideoFileName(); // createVideoFileName() can throw an IOException, so it needs this try/catch
} catch (IOException e) {
e.printStackTrace();
}
startRecord();
mMediaRecorder.start();
}/* else { //check if permission was granted and had been denied
if (shouldShowRequestPermissionRationale(Manifest.permission.WRITE_EXTERNAL_STORAGE)) {
Toast.makeText(this, "app needs to be able to save videos", Toast.LENGTH_SHORT).show();
}
requestPermissions(new String[]{Manifest.permission.WRITE_EXTERNAL_STORAGE}, REQUEST_WRITE_EXTERNAL_STORAGE_PERMISSION_RESULT);
} */
}
else // on versions older than Marshmallow, runtime permissions are not needed
{ // supporting old versions of Android
mIsRecording = true;
mRecordImageButton.setImageResource(R.drawable.btn_video_busy); // switch the button to the "recording" icon
try{
createVideoFileName();
}
catch (IOException e){
e.printStackTrace();
}
startRecord();
mMediaRecorder.start();
}
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_camera);
createVideoFolder(); //creates the video folder when camera page loads
mMediaRecorder = new MediaRecorder();
mTextureView = (TextureView) findViewById(R.id.textureView);
mRecordImageButton = (ImageButton) findViewById(R.id.videoOnlineImageButton);
mRecordImageButton.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
if (mIsRecording) {
mIsRecording = false;
mRecordImageButton.setImageResource(R.drawable.btn_video_online);
mMediaRecorder.stop(); //<-- FATAL EXCEPTION CRASH
mMediaRecorder.reset();
startPreview(); // restart the preview so the TextureView keeps showing frames after recording stops
} else {
checkWriteStoragePermission();
}
}
});
}
private void createVideoFolder() {
File movieFile = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_MOVIES);
mVideoFolder = new File(movieFile, "camera2VideoImage");
if(!mVideoFolder.exists()) {
mVideoFolder.mkdirs();
}
}
12-23 22:23:07.509 14045-14045/com.example.name.videoscrolltrial E/MediaRecorder: stop failed: -1007
12-23 22:23:07.510 14045-14045/com.example.name.videoscrolltrial D/AndroidRuntime: Shutting down VM
12-23 22:23:07.511 14045-14045/com.example.name.videoscrolltrial E/AndroidRuntime: FATAL EXCEPTION: main
Process: com.example.name.videoscrolltrial, PID: 14045
java.lang.RuntimeException: stop failed.
at android.media.MediaRecorder._stop(Native Method)
at android.media.MediaRecorder.stop(MediaRecorder.java:1220)
at com.example.name.videoscrolltrial.Camera$3.onClick(Camera.java:192) //< causes the crash, this is mMediaRecorder.stop();
at android.view.View.performClick(View.java:6308)
at android.view.View$PerformClick.run(View.java:23969)
at android.os.Handler.handleCallback(Handler.java:751)
at android.os.Handler.dispatchMessage(Handler.java:95)
at android.os.Looper.loop(Looper.java:154)
at android.app.ActivityThread.main(ActivityThread.java:6823)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1563)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1451)
This code works on phones like the Nexus 5X, but on my Samsung Galaxy Note 8 it just crashes. What do I have to do to make the code compatible with all phones?
You can't stop the MediaRecorder without having started it first.
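This is not the original code, just a defensive sketch: assuming a boolean field such as mRecorderStarted is set to true right after mMediaRecorder.start() succeeds, stop() can be guarded like this:
// Hypothetical flag: set to true immediately after mMediaRecorder.start() succeeds.
private boolean mRecorderStarted = false;

private void stopRecordingSafely() {
    if (!mRecorderStarted) {
        return; // start() never succeeded, so there is nothing to stop
    }
    try {
        mMediaRecorder.stop();
    } catch (RuntimeException e) {
        // stop() throws if no valid audio/video data was recorded;
        // the output file is unusable in that case
        Log.w("Camera", "MediaRecorder.stop() failed", e);
    } finally {
        mRecorderStarted = false;
        mMediaRecorder.reset();
    }
}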
I am working on a video recording app that shows a preview, starts recording when the user taps the record button, and stops when the user taps the stop button.
The camera preview works, and recording with the back camera works fine.
But when I flip to the front camera and start recording, I get this error:
FATAL EXCEPTION: main java.lang.RuntimeException: start failed.
at android.media.MediaRecorder.start(Native Method) at
com.opkix.app.fragments.CameraFragment.startRecording(
CameraFragment.java:104)
Here's my code for recording video code:
private boolean prepareMediaRecorder() {
// set the orientation here to enable portrait recording.
mediaRecorder = new MediaRecorder();
mCamera.unlock();
mediaRecorder.setCamera(mCamera);
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mediaRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
mediaRecorder.setOutputFile(StorageUtils.getOutputMediaFilePath());
mediaRecorder.setMaxDuration(120000); // max duration: 120 seconds
mediaRecorder.setMaxFileSize(100000000); // max file size: 100 MB
mediaRecorder.setPreviewDisplay(mPreview.getHolder().getSurface());
try {
mediaRecorder.prepare();
} catch (IllegalStateException e) {
releaseMediaRecorder();
return false;
} catch (IOException e) {
releaseMediaRecorder();
return false;
}
return true;
}
Can anyone please suggest solution?
I am also working on a video recording app. Please try running your code on a few other devices. I am sharing my code below; some of it is specific to my app, so remove whatever you don't need.
My Code :
private boolean prepareVideoRecorder() {
mRecorder = new MediaRecorder();
// Both are required for Portrait Video
mCamera.setDisplayOrientation(90);
if (mCameraId == CAMERA_FACING_FRONT) {
mRecorder.setOrientationHint(270);
} else {
mRecorder.setOrientationHint(90);
}
// Step 1: Unlock and set camera to MediaRecorder
mCamera.unlock();
mRecorder.setCamera(mCamera);
// Step 2: Set sources
mRecorder.setAudioSource(MediaRecorder.AudioSource.DEFAULT);
mRecorder.setVideoSource(MediaRecorder.VideoSource.DEFAULT);
// Step 3: Set a CamcorderProfile (requires API Level 8 or higher)
mRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_480P));
// Step 4: Set output file
final File folder;
if (Environment.getExternalStorageState().equals(Environment.MEDIA_MOUNTED)) {
folder = new File(Environment.getExternalStorageDirectory() + "/CameraApp/Videos");
} else {
folder = new File(Environment.getExternalStorageDirectory() + "/CameraApp/Videos");
}
boolean success = true;
File videoFile;
if (!folder.exists()) {
success = folder.mkdirs();
}
if (success) {
videoFile = new File(folder.getAbsolutePath() + File.separator + getFileNameCustomFormat() + " " + ".mp4");
SavedVideoPath = getFileNameCustomFormat() + " " + ".mp4";
Log.e("Video Path - ", SavedVideoPath);
} else {
Toast.makeText(getBaseContext(), "Video Not saved", Toast.LENGTH_SHORT).show();
return true;
}
mRecorder.setOutputFile(String.valueOf(videoFile));
// mRecorder.setVideoSize(mPreviewWidth, mPreviewHeight);
// Step 5: Set the preview output
mRecorder.setPreviewDisplay(mSurfaceHolder.getSurface());
// Step 6: Prepare configured MediaRecorder
try {
mRecorder.prepare();
} catch (IllegalStateException e) {
Toast.makeText(getApplicationContext(), "prepareVideoRecorder() Exception: " + e.getMessage(), Toast.LENGTH_LONG).show();
releaseMediaRecorder();
return false;
} catch (IOException e) {
Toast.makeText(getApplicationContext(), "prepareVideoRecorder() Exception: " + e.getMessage(), Toast.LENGTH_LONG).show();
releaseMediaRecorder();
return false;
}
return true;
}
Let me know what happens after you try my code.
Hope this helps!
I've looked through other people's encounters with this problem and have not found an adequate solution.
Like them, I followed the tutorial on camera functionality at: http://developer.android.com/guide/topics/media/camera.html.
Everything listed below works perfectly, to the point where I assume that the program has recorded video as I intended. However, when I review the video in the gallery, it has not appeared. I'm confused, as there are no IOExceptions or other bugs present when connected for USB debugging. Strangely, upon removing the USB cable and plugging it in again, whether immediately or days later, all of the previously recorded videos appear in the gallery. Clearly there is something I have missed, or some aspect of recording video that I am not aware of. I would appreciate any help or guidance, thank you.
Pertinent code is as follows, I'll post more if someone needs it.
Camera Activity:
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_camera);
mCamera = MainActivity.getCameraInstance();
// Create our Preview view and set it as the content of our activity.
mPreview = new CameraPreview(this, mCamera);
FrameLayout preview = (FrameLayout) findViewById(R.id.camera_preview);
preview.addView(mPreview);
button_capture = (Button) findViewById(R.id.button_capture);
button_capture.setOnClickListener(
new View.OnClickListener() {
@Override
public void onClick(View v) {
if (isRecording) {
// stop recording and release camera
mMediaRecorder.stop(); // stop the recording
releaseMediaRecorder(); // release the MediaRecorder object
mCamera.lock(); // take camera access back from MediaRecorder
// inform the user that recording has stopped
button_capture.setText("Capture");
isRecording = false;
} else {
// initialize video camera
if (prepareVideoRecorder()) {
// Camera is available and unlocked, MediaRecorder is prepared,
// now you can start recording
mMediaRecorder.start();
// inform the user that recording has started
button_capture.setText("Stop");
isRecording = true;
} else {
// prepare didn't work, release the camera
releaseMediaRecorder();
// inform user
}
}
}
});
}
private boolean prepareVideoRecorder(){
mMediaRecorder = new MediaRecorder();
// Step 1: Unlock and set camera to MediaRecorder
mCamera.unlock();
mMediaRecorder.setCamera(mCamera);
// Step 2: Set sources
mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
// Step 3: Set a CamcorderProfile (requires API Level 8 or higher)
mMediaRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
// Step 4: Set output file
mMediaRecorder.setOutputFile(MediaCapture.getOutputMediaFile(MEDIA_TYPE_VIDEO).toString());
// Step 5: Set the preview output
mMediaRecorder.setPreviewDisplay(mPreview.getHolder().getSurface());
// Step 6: Prepare configured MediaRecorder
try {
mMediaRecorder.prepare();
} catch (IllegalStateException e) {
Log.d(TAG, "IllegalStateException preparing MediaRecorder: " + e.getMessage());
releaseMediaRecorder();
return false;
} catch (IOException e) {
Log.d(TAG, "IOException preparing MediaRecorder: " + e.getMessage());
releaseMediaRecorder();
return false;
}
return true;
}
Media Capture:
public static Uri getOutputMediaFileUri(int type) {
return Uri.fromFile(getOutputMediaFile(type));
}
/**
* Create a File for saving an image or video
*/
public static File getOutputMediaFile(int type) {
// To be safe, you should check that the SDCard is mounted
// using Environment.getExternalStorageState() before doing this.
File mediaStorageDir = new File(Environment.getExternalStoragePublicDirectory(
Environment.DIRECTORY_MOVIES), "MyApplication");
// This location works best if you want the created images to be shared
// between applications and persist after your app has been uninstalled.
// Create the storage directory if it does not exist
if (!mediaStorageDir.exists()) {
if (!mediaStorageDir.mkdirs()) {
Log.d("MyApplication", "failed to create directory");
return null;
}
}
// Create a media file name
String timeStamp = new SimpleDateFormat("yyyyMMdd_HH:mm:ss").format(new Date());
File mediaFile;
if (type == MEDIA_TYPE_VIDEO) {
mediaFile = new File(mediaStorageDir.getPath() + File.separator +
"VID_" + timeStamp + ".mp4");
} else {
return null;
}
return mediaFile;
}
}
I too was very perplexed by this behaviour until I realised unplugging the USB was causing the files to appear. This appears to be a bug in Android, possibly related to this one:
https://code.google.com/p/android/issues/detail?id=195362
Clearly, the file has been written correctly, but for some reason the new file is not visible. In any case, the solution/workaround offered in the link above is to force a media scan of your file. e.g.
MediaScannerConnection.scanFile(getContext(), new String[]{this.mediaFile.getAbsolutePath()}, null, null);
Where 'mediaFile' is the file your mediaRecorder has just finished writing. With this in place I do not need to unplug the USB cable to see the file and it appears immediately after recording has finished.
I am running Android 5.0.2 on a Samsung Galaxy A5. This feels more like a workaround than a fix, and I can't be sure it will work on other devices or Android versions, but I hope it helps someone.
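For context, a minimal sketch of that workaround (assuming an Activity context and that mediaFile is the file the recorder just wrote, mirroring the one-liner above) might look like this:
mMediaRecorder.stop();           // finish writing the video file
releaseMediaRecorder();

// Ask the media scanner to index the new file so it appears in the
// gallery immediately, without unplugging the USB cable.
MediaScannerConnection.scanFile(
        this,                                        // Activity context (assumed)
        new String[]{ mediaFile.getAbsolutePath() }, // the file just written
        null,                                        // let the scanner infer the MIME type
        new MediaScannerConnection.OnScanCompletedListener() {
            @Override
            public void onScanCompleted(String path, Uri uri) {
                Log.d(TAG, "Media scanner registered " + path + " as " + uri);
            }
        });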
I need a way to control the camera flash on an Android device while it is recording video. I'm making a strobe light app, and taking videos with a flashing strobe light would result in the ability to record objects that are moving at high speeds, like a fan blade.
The flash can only be enabled by starting a video preview and setting FLASH_MODE_TORCH in the camera's parameters. That would look like this:
Camera c = Camera.open();
Camera.Parameters p = c.getParameters();
p.setFlashMode(Camera.Parameters.FLASH_MODE_TORCH);
c.setParameters(p);
c.startPreview();
Once the preview has started, I can flip that parameter back and forth to turn the light on and off. This works well until I try to record a video. The trouble is that in order to give the camera to the MediaRecorder, I first have to unlock it.
MediaRecorder m = new MediaRecorder();
c.unlock(); // the killer
m.setCamera(c);
After that unlock, I can no longer change the camera parameters and therefore have no way to change the flash state.
I do not know if it is actually possible to do this since I'm not the best at java-hacking, but here is what I do know:
Camera.unlock() is a native method, so I can't really see the mechanism behind the way it locks me out
Camera.Parameter has a HashMap that contains all of its parameters
Camera.setParameters(Parameters) takes the HashMap, converts it to a string, and passes it to a native method
I can eliminate all the parameters but TORCH-MODE from the HashMap and the Camera will still accept it
So, I can still access the Camera, but it won't listen to anything I tell it. (Which is kind of the purpose of Camera.unlock())
Edit:
After examining the native code, I can see that in CameraService.cpp my calls to Camera.setParameters(Parameters) get rejected because my Process ID does not match the Process ID the camera service has on record. So it would appear that that is my hurdle.
Edit2:
It would appear that the MediaPlayerService is the primary service that takes control of the camera when a video is recording. I do not know if it is possible, but if I could somehow start that service in my own process, I should be able to skip the Camera.unlock() call.
Edit3:
One last option would be if I could somehow get a pointer to the CameraHardwareInterface. From the looks of it, this is a device specific interface and probably does not include the PID checks. The main problem with this though is that the only place that I can find a pointer to it is in CameraService, and CameraService isn't talking.
Edit4: (several months later)
At this point, I don't think it is possible to do what I originally wanted. I don't want to delete the question on the off chance that someone does answer it, but I'm not actively seeking an answer. (Though, receiving a valid answer would be awesome.)
I encountered a similar issue. The user should be able to change the flash mode during recording, depending on the lighting situation. After some investigation I came to the following solution:
I assume that you've already set up a proper SurfaceView and a SurfaceHolder with the necessary callbacks. The first thing I did was add this code (undeclared variables are fields):
public void surfaceCreated(SurfaceHolder holder) {
try {
camera = Camera.open();
parameters = camera.getParameters();
parameters.setFlashMode(Parameters.FLASH_MODE_OFF);
camera.setParameters(parameters);
camera.setPreviewDisplay(holder);
camera.startPreview();
recorder = new MediaRecorder();
} catch (IOException e) {
e.printStackTrace();
}
}
My next step was initializing and preparing the recorder:
private void initialize() {
camera.unlock();
recorder.setCamera(camera);
recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
recorder.setVideoFrameRate(20);
recorder.setOutputFile(filePath);
try {
recorder.prepare();
} catch (IllegalStateException e) {
e.printStackTrace();
finish();
} catch (IOException e) {
e.printStackTrace();
finish();
}
}
It's important to note that camera.unlock() has to be called BEFORE the whole initialization of the media recorder. Also be aware of the proper order of the set calls, otherwise you'll get an IllegalStateException when calling prepare() or start(). When it comes to recording, I do this (usually triggered by a view element):
public void record(View view) {
if (recording) {
recorder.stop();
//TODO: do stuff....
recording = false;
} else {
recording = true;
initialize();
recorder.start();
}
}
So now I can finally record properly. But what about the flash? Last but not least, here comes the magic behind the scenes:
public void flash(View view) {
if(!recording) {
camera.lock();
}
parameters.setFlashMode(parameters.getFlashMode().equals(Parameters.FLASH_MODE_TORCH) ? Parameters.FLASH_MODE_OFF : Parameters.FLASH_MODE_TORCH);
camera.setParameters(parameters);
if(!recording) {
camera.unlock();
}
}
Every time I call that method via an onClick action I can change the flash mode, even during recording. Just take care to lock the camera properly. Once the lock is acquired by the media recorder during recording, you don't have to lock/unlock the camera again; it doesn't even work. This was tested on a Samsung Galaxy S3 running Android 4.1.2. Hope this approach helps.
After preparing the media recorder, call camera.lock(), and then set whatever parameters you want on the camera.
But before starting recording you need to call camera.unlock() again, and after you stop the media recorder you need to call camera.lock() to restart the preview.
Enjoy!!!
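A rough sketch of that lock/unlock ordering, assuming camera and recorder fields set up as in the answer above (illustrative only, not a drop-in implementation):
// After recorder.prepare(): take the camera lock back so parameters can be changed.
camera.lock();
Camera.Parameters params = camera.getParameters();
params.setFlashMode(Camera.Parameters.FLASH_MODE_TORCH);
camera.setParameters(params);

// Before recording: hand the camera back to the MediaRecorder.
camera.unlock();
recorder.start();

// ... recording ...

recorder.stop();

// After stopping: reclaim the camera so the preview keeps running.
camera.lock();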
Try this, hopefully it will work. :)
private static Torch torch;
public Torch() {
super();
torch = this;
}
public static Torch getTorch() {
return torch;
}
private void getCamera() {
if (mCamera == null) {
try {
mCamera = Camera.open();
} catch (RuntimeException e) {
Log.e(TAG, "Camera.open() failed: " + e.getMessage());
}
}
}
public void toggleLight(View view) {
toggleLight();
}
private void toggleLight() {
if (lightOn) {
turnLightOff();
} else {
turnLightOn();
}
}
private void turnLightOn() {
if (!eulaAgreed) {
return;
}
if (mCamera == null) {
Toast.makeText(this, "Camera not found", Toast.LENGTH_LONG);
button.setBackgroundColor(COLOR_WHITE);
return;
}
lightOn = true;
Parameters parameters = mCamera.getParameters();
if (parameters == null) {
button.setBackgroundColor(COLOR_WHITE);
return;
}
List<String> flashModes = parameters.getSupportedFlashModes();
if (flashModes == null) {
button.setBackgroundColor(COLOR_WHITE);
return;
}
String flashMode = parameters.getFlashMode();
Log.i(TAG, "Flash mode: " + flashMode);
Log.i(TAG, "Flash modes: " + flashModes);
if (!Parameters.FLASH_MODE_TORCH.equals(flashMode)) {
if (flashModes.contains(Parameters.FLASH_MODE_TORCH)) {
parameters.setFlashMode(Parameters.FLASH_MODE_TORCH);
mCamera.setParameters(parameters);
button.setBackgroundColor(COLOR_LIGHT);
startWakeLock();
} else {
Toast.makeText(this, "Flash mode (torch) not supported",
Toast.LENGTH_LONG);
button.setBackgroundColor(COLOR_WHITE);
Log.e(TAG, "FLASH_MODE_TORCH not supported");
}
}
}
private void turnLightOff() {
if (lightOn) {
button.setBackgroundColor(COLOR_DARK);
lightOn = false;
if (mCamera == null) {
return;
}
Parameters parameters = mCamera.getParameters();
if (parameters == null) {
return;
}
List<String> flashModes = parameters.getSupportedFlashModes();
String flashMode = parameters.getFlashMode();
if (flashModes == null) {
return;
}
Log.i(TAG, "Flash mode: " + flashMode);
Log.i(TAG, "Flash modes: " + flashModes);
if (!Parameters.FLASH_MODE_OFF.equals(flashMode)) {
if (flashModes.contains(Parameters.FLASH_MODE_OFF)) {
parameters.setFlashMode(Parameters.FLASH_MODE_OFF);
mCamera.setParameters(parameters);
stopWakeLock();
} else {
Log.e(TAG, "FLASH_MODE_OFF not supported");
}
}
}
}
private void startPreview() {
if (!previewOn && mCamera != null) {
mCamera.startPreview();
previewOn = true;
}
}
private void stopPreview() {
if (previewOn && mCamera != null) {
mCamera.stopPreview();
previewOn = false;
}
}
private void startWakeLock() {
if (wakeLock == null) {
Log.d(TAG, "wakeLock is null, getting a new WakeLock");
PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
Log.d(TAG, "PowerManager acquired");
wakeLock = pm.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, WAKE_LOCK_TAG);
Log.d(TAG, "WakeLock set");
}
wakeLock.acquire();
Log.d(TAG, "WakeLock acquired");
}
private void stopWakeLock() {
if (wakeLock != null) {
wakeLock.release();
Log.d(TAG, "WakeLock released");
}
}
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
if (Eula.show(this)) {
eulaAgreed = true;
}
setContentView(R.layout.main);
button = findViewById(R.id.button);
surfaceView = (SurfaceView) this.findViewById(R.id.surfaceview);
surfaceHolder = surfaceView.getHolder();
surfaceHolder.addCallback(this);
surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
disablePhoneSleep();
Log.i(TAG, "onCreate");
}
To access the device camera, you must declare the CAMERA permission in your Android Manifest. Also be sure to include the <uses-feature> manifest element to declare camera features used by your application. For example, if you use the camera and auto-focus feature, your Manifest should include the following:
<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
A sample that checks for torch support might look something like this:
//Create camera and parameter objects
private Camera mCamera;
private Camera.Parameters mParameters;
private boolean mbTorchEnabled = false;
//... later in a click handler or other location, assuming that the mCamera object has already been instantiated with Camera.open()
mParameters = mCamera.getParameters();
//Get supported flash modes
List<String> flashModes = mParameters.getSupportedFlashModes();
//Make sure that torch mode is supported
//EDIT - wrong and dangerous to check for torch support this way
//if(flashModes != null && flashModes.contains("torch")){
if(flashModes != null && flashModes.contains(Camera.Parameters.FLASH_MODE_TORCH)){
if(mbTorchEnabled){
//Set the flash parameter to off
mParameters.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
}
else{
//Set the flash parameter to use the torch
mParameters.setFlashMode(Camera.Parameters.FLASH_MODE_TORCH);
}
//Commit the camera parameters
mCamera.setParameters(mParameters);
mbTorchEnabled = !mbTorchEnabled;
}
To turn the torch on, you simply set the flash mode parameter to Camera.Parameters.FLASH_MODE_TORCH:
Camera mCamera;
Camera.Parameters mParameters;
//Get a reference to the camera/parameters
mCamera = Camera.open();
mParameters = mCamera.getParameters();
//Set the torch parameter
mParameters.setFlashMode(Camera.Parameters.FLASH_MODE_TORCH);
//Commit camera parameters
mCamera.setParameters(mParameters);
To turn the torch off, set Camera.Parameters.FLASH_MODE_OFF
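For symmetry with the snippet above, turning the torch off again (reusing the same mCamera and mParameters objects) would look roughly like this:
//Set the flash parameter back to off
mParameters.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
//Commit camera parameters
mCamera.setParameters(mParameters);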
I have some code I have been experimenting with to see what I can do with the camera device. The following code works, but I have some issues with it that I cannot seem to solve:
1. The first call never works: the first time the code runs, the onPictureTaken callback is never called, so the file is never written. The camera still goes through all the other steps, including making the fake shutter noise.
2. I can't seem to set the picture size to anything other than the default. If I try to set it to something else, the code stops working in the same way as above: the camera goes through all the motions, but the onPictureTaken callback is never called.
3. When the pictures are saved to the DCIM folder, they do not show up in the default 'Photos' app on my phone unless I reboot the phone.
4. Is there any way through code to disable the shutter noise?
Sorry, the code is a little messy because its an experiment.
Also, this code is executed in a BroadcastReceiver.
@Override
public void onReceive(Context context, Intent intent) {
// TODO Auto-generated method stub
if(intent.getAction().equals(TAKE_PICTURE_INTENT))
{
Toast.makeText(context, "Test", Toast.LENGTH_LONG).show();
System.out.println("GOT THE INTENT");
try
{
Camera camera = Camera.open();
System.out.println("CAMERA OPENED");
Parameters params = camera.getParameters();
params.set("flash-mode", "off");
params.set("focus-mode", "infinity");
params.set("jpeg-quality", "100");
//params.setPictureSize(2592, 1952);
String str = params.get("picture-size" + "-values");
System.out.println(str);
String size = str.split(",")[0];
System.out.println(size);
//params.set("picture-size", size);
camera.setParameters(params);
System.out.println("CAMERA PARAMETERS SET");
camera.startPreview();
System.out.println("CAMERA PREVIEW STARTED");
camera.autoFocus(new AutoFocusCallBackImpl());
}
catch(Exception ex)
{
System.out.println("CAMERA FAIL, SKIP");
return ;
}
}//if
}//onreceive
private void TakePicture(Camera camera)
{
camera.takePicture(new Camera.ShutterCallback() {
@Override
public void onShutter() {
// TODO Auto-generated method stub
System.out.println("CAMERA SHUTTER CALLBACK");
}
}
, null,
new Camera.PictureCallback() {
public void onPictureTaken(byte[] imageData, Camera c) {
//c.release();
System.out.println("CAMERA CALLBACK");
FileOutputStream outStream = null;
try {
System.out.println("Start Callback");
File esd = Environment.getExternalStorageDirectory();
outStream = new FileOutputStream(esd.getAbsolutePath() + String.format(
"/DCIM/%d.jpg", System.currentTimeMillis()));
outStream.write(imageData);
outStream.close();
System.out.println( "onPictureTaken - wrote bytes: " + imageData.length);
} catch (FileNotFoundException e) {
e.printStackTrace();
System.out.println("File not found exception");
} catch (IOException e) {
e.printStackTrace();
System.out.println("IO exception");
} finally {
System.out.println("Finally");
c.release();
}
}
}
);
//camera.release();
}//TAKE PICTURE
private class AutoFocusCallBackImpl implements Camera.AutoFocusCallback {
@Override
public void onAutoFocus(boolean success, Camera camera) {
//bIsAutoFocused = success; //update the flag used in onKeyDown()
System.out.println("Inside autofocus callback. autofocused="+success);
//play the autofocus sound
//MediaPlayer.create(CameraActivity.this, R.raw.auto_focus).start();
if(success)
{
System.out.println("AUTO FOCUS SUCCEDED");
}
else
{
System.out.println("AUTO FOCUS FAILED");
}
TakePicture(camera);
System.out.println("CALLED TAKE PICTURE");
}
}//AUTOFOCUSCALLBACK
1. First of all, move all the camera logic out of the BroadcastReceiver and into a separate Activity.
2.
When the pictures are saved to the DCIM folder, the taken pictures do not show up in the default 'Photos' app on my phone, unless I reboot the phone.
This happens because the MediaScanner needs to be asked to rescan once you take a photo. When you reboot the phone, the media scanner scans the storage and finds the new images. For this issue you should look into MediaScannerConnection (a sketch follows below).
3. Follow the Android Camera tutorial and the Camera API docs.
-Thanks
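To illustrate point 2, a minimal sketch, assuming it runs right after the JPEG has been written in onPictureTaken, that photoPath is a hypothetical variable holding the path just written, and that the Context from onReceive() is available here:
// photoPath (hypothetical): the absolute path the JPEG was just written to.
MediaScannerConnection.scanFile(
        context,                         // e.g. the Context passed to onReceive()
        new String[]{ photoPath },
        new String[]{ "image/jpeg" },    // MIME type of the saved picture
        new MediaScannerConnection.OnScanCompletedListener() {
            @Override
            public void onScanCompleted(String path, Uri uri) {
                // The gallery can now see the new picture without a reboot.
                System.out.println("Scanned " + path + " -> " + uri);
            }
        });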