MediaRecorder not saving files - java

I've had a look through other people's encounters with this problem and have not found an adequate solution.
Like them, I followed the tutorial on camera functionality at: http://developer.android.com/guide/topics/media/camera.html.
Everything listed below works perfectly, to the point where I assume that the program has recorded video as I intended. However, upon reviewing the video in the gallery, it has not appeared. I'm confused, as there are no IOExceptions or other bugs present when connected for USB debugging. Strangely, upon removing the USB and plugging it in again, whenever that may be, either immediately or at some point days in the future, all of the previously recorded videos appear in the gallery. Clearly there is something I have missed or some aspect of recording video that I am not aware of. I would appreciate any help or guidance, thank you.
Pertinent code is as follows, I'll post more if someone needs it.
Camera Activity:
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_camera);
mCamera = MainActivity.getCameraInstance();
// Create our Preview view and set it as the content of our activity.
mPreview = new CameraPreview(this, mCamera);
FrameLayout preview = (FrameLayout) findViewById(R.id.camera_preview);
preview.addView(mPreview);
button_capture = (Button) findViewById(R.id.button_capture);
button_capture.setOnClickListener(
new View.OnClickListener() {
@Override
public void onClick(View v) {
if (isRecording) {
// stop recording and release camera
mMediaRecorder.stop(); // stop the recording
releaseMediaRecorder(); // release the MediaRecorder object
mCamera.lock(); // take camera access back from MediaRecorder
// inform the user that recording has stopped
button_capture.setText("Capture");
isRecording = false;
} else {
// initialize video camera
if (prepareVideoRecorder()) {
// Camera is available and unlocked, MediaRecorder is prepared,
// now you can start recording
mMediaRecorder.start();
// inform the user that recording has started
button_capture.setText("Stop");
isRecording = true;
} else {
// prepare didn't work, release the camera
releaseMediaRecorder();
// inform user
}
}
}
});
}
private boolean prepareVideoRecorder(){
mMediaRecorder = new MediaRecorder();
// Step 1: Unlock and set camera to MediaRecorder
mCamera.unlock();
mMediaRecorder.setCamera(mCamera);
// Step 2: Set sources
mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
// Step 3: Set a CamcorderProfile (requires API Level 8 or higher)
mMediaRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
// Step 4: Set output file
mMediaRecorder.setOutputFile(MediaCapture.getOutputMediaFile(MEDIA_TYPE_VIDEO).toString());
// Step 5: Set the preview output
mMediaRecorder.setPreviewDisplay(mPreview.getHolder().getSurface());
// Step 6: Prepare configured MediaRecorder
try {
mMediaRecorder.prepare();
} catch (IllegalStateException e) {
Log.d(TAG, "IllegalStateException preparing MediaRecorder: " + e.getMessage());
releaseMediaRecorder();
return false;
} catch (IOException e) {
Log.d(TAG, "IOException preparing MediaRecorder: " + e.getMessage());
releaseMediaRecorder();
return false;
}
return true;
}
Media Capture:
public static Uri getOutputMediaFileUri(int type) {
return Uri.fromFile(getOutputMediaFile(MEDIA_TYPE_VIDEO));
}
/**
* Create a File for saving an image or video
*/
public static File getOutputMediaFile(int type) {
// To be safe, you should check that the SDCard is mounted
// using Environment.getExternalStorageState() before doing this.
File mediaStorageDir = new File(Environment.getExternalStoragePublicDirectory(
Environment.DIRECTORY_MOVIES), "MyApplication");
// This location works best if you want the created images to be shared
// between applications and persist after your app has been uninstalled.
// Create the storage directory if it does not exist
if (!mediaStorageDir.exists()) {
if (!mediaStorageDir.mkdirs()) {
Log.d("MyApplication", "failed to create directory");
return null;
}
}
// Create a media file name
String timeStamp = new SimpleDateFormat("yyyyMMdd_HH:mm:ss").format(new Date());
File mediaFile;
if (type == MEDIA_TYPE_VIDEO) {
mediaFile = new File(mediaStorageDir.getPath() + File.separator +
"VID_" + timeStamp + ".mp4");
} else {
return null;
}
return mediaFile;
}
}
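As an aside, the mounted-storage check mentioned in the comment above could be a simple guard at the top of getOutputMediaFile(); a minimal sketch:
// Bail out early if external storage is not mounted and writable.
if (!Environment.MEDIA_MOUNTED.equals(Environment.getExternalStorageState())) {
    Log.d("MyApplication", "external storage is not mounted");
    return null;
}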

I too was very perplexed by this behaviour until I realised unplugging the USB was causing the files to appear. This appears to be a bug in Android, possibly related to this one:
https://code.google.com/p/android/issues/detail?id=195362
Clearly, the file has been written correctly, but for some reason the new file is not visible. In any case, the solution/workaround offered in the link above is to force a media scan of your file. e.g.
MediaScannerConnection.scanFile(getContext(), new String[]{this.mediaFile.getAbsolutePath()}, null, null);
Where 'mediaFile' is the file your mediaRecorder has just finished writing. With this in place I do not need to unplug the USB cable to see the file and it appears immediately after recording has finished.
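For example, a minimal sketch of the stop path in the activity's click handler with the scan added (assuming mediaFile is a field holding the File returned by getOutputMediaFile()):
mMediaRecorder.stop();               // stop the recording
releaseMediaRecorder();              // release the MediaRecorder object
mCamera.lock();                      // take camera access back from MediaRecorder

// Ask the media scanner to index the new file so it shows up in the gallery
// immediately, without replugging the USB cable.
MediaScannerConnection.scanFile(getApplicationContext(),
        new String[]{ mediaFile.getAbsolutePath() },
        null,   // MIME types (null lets the scanner infer them)
        null);  // no completion callback needed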
I am running Android 5.0.2 on a Samsung Galaxy A5. This feels more of a workaround than a fix and I can't be sure it will work on other devices or Android versions, but I hope it helps someone.

Related

How to freeze/lock Android CameraX preview with flash light updates on taking a picture?

I am implementing CameraX. The issue I am facing is implementing a mechanism to lock/freeze the camera preview when a picture is captured. Currently I have implemented a workaround, but it doesn't work well if the flash is on while capturing. I grab a frame from previewView (a PreviewView) with previewView.getBitmap() just before capturing the image and then display it in captureImage (an ImageView). But the frozen frame does not show the flash. My current code is below:
private void capturePhoto() {
showProgress(true);
// Get the Information to be used & stored with Image
ContentValues contentValues = getImageSaveInfo();
Uri externalUri = MediaStore.Images.Media.EXTERNAL_CONTENT_URI;
ImageCapture.OutputFileOptions options = new ImageCapture.OutputFileOptions
.Builder(getContentResolver(), externalUri, contentValues)
.build();
// Play the Capture Sound when a picture is captured.
playCameraShutterSound();
// Display current frame From Preview in ImageView.
freezePreview(true);
imageCapture.takePicture(options,
ContextCompat.getMainExecutor(this),
new ImageCapture.OnImageSavedCallback() {
@Override
public void onImageSaved(@NonNull ImageCapture.OutputFileResults results) {
ToastUtility.successToast(getApplicationContext(),
"Photo Capture Successfully");
// Update Last Taken Image View with new Image
getLastTakenImage();
if (results.getSavedUri() != null) {
Log.d(TAG, "Image Saved At -> " + results.getSavedUri().toString());
}
showProgress(false);
freezePreview(false);
}
@Override
public void onError(@NonNull ImageCaptureException exception) {
ToastUtility.errorToast(getApplicationContext(),
"Photo Couldn't Capture");
Log.d(TAG, "Image Capture Error -> " + exception.getMessage());
showProgress(false);
freezePreview(false);
}
});
}
private void freezePreview(boolean value) {
if (value) {
Bitmap bitmap = mainBinding.previewView.getBitmap();
Glide.with(getApplicationContext())
.load(bitmap).into(mainBinding.captureImage);
mainBinding.captureImage.setVisibility(View.VISIBLE);
mainBinding.previewView.setVisibility(View.INVISIBLE);
} else {
mainBinding.previewView.setVisibility(View.VISIBLE);
mainBinding.captureImage.setVisibility(View.INVISIBLE);
}
}
The flash is triggered at some point after takePicture() is called, and there isn't a callback for it in CameraX, so there is no direct way to know when it fires.
You can instead use the camera2 interop to check the flash state indirectly. Add a session CaptureCallback to ImageCapture's config, then inside the callback's onCaptureCompleted, check whether the flash state of the total result is FIRED.
// Override onCaptureCompleted to check for the flash state
CameraCaptureSession.CaptureCallback sessionCaptureCallback = //... ;
// Initialize an ImageCapture builder
ImageCapture.Builder configBuilder = new ImageCapture.Builder();
// Add the session CaptureCallback to it
new Camera2Interop.Extender<>(configBuilder)
.setSessionCaptureCallback(sessionCaptureCallback);
// Build the ImageCapture use case
ImageCapture useCase = configBuilder.build();
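A hedged sketch of what that session callback might look like (the callback classes and the FLASH_STATE key come from android.hardware.camera2; what you do inside the if block is up to you):
CameraCaptureSession.CaptureCallback sessionCaptureCallback =
        new CameraCaptureSession.CaptureCallback() {
            @Override
            public void onCaptureCompleted(@NonNull CameraCaptureSession session,
                                           @NonNull CaptureRequest request,
                                           @NonNull TotalCaptureResult result) {
                // Check whether the flash actually fired for this capture.
                Integer flashState = result.get(CaptureResult.FLASH_STATE);
                if (flashState != null && flashState == CaptureResult.FLASH_STATE_FIRED) {
                    // The flash fired; this is the point at which you could grab the
                    // preview frame, e.g. by posting freezePreview(true) to the main thread.
                }
            }
        };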

Attempting to record mic input over Bluetooth

I have an Arduino which broadcasts input from a mic via Bluetooth.
I want to connect the phone so that it records and saves the input from the Arduino mic received via Bluetooth.
When I run the following code, I have a few issues.
I can't seem to find the file I saved:
File file=new File(mFilePath2,"test.txt");
On Logcat I'm getting the following errors when I run Bluetooth_Test()
ACDB-LOADER - Error: ACDB AudProc vol returned = -19
MediaPlayer - Error (1, -1004)
When I run stop() I get:
MPEG4Writer - Stop() called but track is not started
MediaPlayer-JNI QCMediaPlayer mediaplayer NOT present
MediaPlayer - Should have subtitle controller already set
I'm not sure what is happening and I'm not sure how to figure it out.
References:
Sony - Use Bluetooth for audio I/O
Media Player called in state 0, error (-38,0) (it solved one error, not listed above)
Code:
public void Bluetooth_Test (){
Toast.makeText(getActivity(), "weee", Toast.LENGTH_LONG).show();
maudioManager = (AudioManager) getActivity().getSystemService(getActivity().AUDIO_SERVICE);
// Switch to headset
maudioManager.setMode(AudioManager.MODE_IN_CALL); // to use headset's I/O and not phone's
// Start audio I/O operation (in background)
maudioManager.startBluetoothSco();
mRecorder = new MediaRecorder();
mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC); // set source to current mic (should be Bluetooth)
mFilePath = Environment.getExternalStorageDirectory().getAbsolutePath();
String mFilePath2 = mFilePath;
mFilePath += "/youraudiofile.mp3";
File file=new File(mFilePath2,"test.txt");
// Set file extension for the recorded audio file
mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
// Set a file path for the recorded audio file
mRecorder.setOutputFile(mFilePath);
// Set encoding of the audio
mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
try {
mRecorder.prepare();
}
catch (IOException e){
Log.e("Starting mRecorder", "IO Exception");
}
// Start recording
mRecorder.start();
}
public void stop () {
mRecorder.stop();
mRecorder.reset();
mRecorder.release();
mRecorder = null;
final MediaPlayer mPlayer = new MediaPlayer();
try {
mPlayer.setDataSource(mFilePath);
mPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
@Override
public void onPrepared(MediaPlayer mp) {
mPlayer.start(); // Start playing audio file
}
});
// Audio file to be played
mPlayer.prepareAsync();
} catch (IOException e){
Log.e("Stop Function", "IO Exception");
}
}

Enabling Camera Flash While Recording Video

I need a way to control the camera flash on an Android device while it is recording video. I'm making a strobe light app, and taking videos with a flashing strobe light would result in the ability to record objects that are moving at high speeds, like a fan blade.
The flash can only be enabled by starting a video preview and setting FLASH_MODE_TORCH in the camera's parameters. That would look like this:
Camera c = Camera.open();
Camera.Parameters p = c.getParameters();
p.setFlashMode(Camera.Parameters.FLASH_MODE_TORCH);
c.setParameters(p);
c.startPreview();
Once the preview has started, I can flip that parameter back and forth to turn the light on and off. This works well until I try to record a video. The trouble is that in order to give the camera to the MediaRecorder, I first have to unlock it.
MediaRecorder m = new MediaRecorder();
c.unlock(); // the killer
m.setCamera(c);
After that unlock, I can no longer change the camera parameters and therefore have no way to change the flash state.
I do not know if it is actually possible to do this since I'm not the best at java-hacking, but here is what I do know:
Camera.unlock() is a native method, so I can't really see the mechanism behind the way it locks me out
Camera.Parameter has a HashMap that contains all of its parameters
Camera.setParameters(Parameters) takes the HashMap, converts it to a string, and passes it to a native method
I can eliminate all the parameters but TORCH-MODE from the HashMap and the Camera will still accept it
So, I can still access the Camera, but it won't listen to anything I tell it. (Which is kind of the purpose of Camera.unlock())
Edit:
After examining the native code, I can see that in CameraService.cpp my calls to Camera.setParameters(Parameters) get rejected because my Process ID does not match the Process ID the camera service has on record. So it would appear that that is my hurdle.
Edit2:
It would appear that the MediaPlayerService is the primary service that takes control of the camera when a video is recording. I do not know if it is possible, but if I could somehow start that service in my own process, I should be able to skip the Camera.unlock() call.
Edit3:
One last option would be if I could somehow get a pointer to the CameraHardwareInterface. From the looks of it, this is a device specific interface and probably does not include the PID checks. The main problem with this though is that the only place that I can find a pointer to it is in CameraService, and CameraService isn't talking.
Edit4: (several months later)
At this point, I don't think it is possible to do what I originally wanted. I don't want to delete the question on the off chance that someone does answer it, but I'm not actively seeking an answer. (Though, receiving a valid answer would be awesome.)
I encountered a similar issue. The user should be able to change the flash mode during recording, depending on the lighting situation. After some investigation I came to the following solution:
I assume that you've already set up a proper SurfaceView and a SurfaceHolder with its necessary callbacks. The first thing I did was provide this code (undeclared variables are globals):
public void surfaceCreated(SurfaceHolder holder) {
try {
camera = Camera.open();
parameters = camera.getParameters();
parameters.setFlashMode(Parameters.FLASH_MODE_OFF);
camera.setParameters(parameters);
camera.setPreviewDisplay(holder);
camera.startPreview();
recorder = new MediaRecorder();
} catch (IOException e) {
e.printStackTrace();
}
}
My next step was initializing and preparing the recorder:
private void initialize() {
camera.unlock();
recorder.setCamera(camera);
recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
recorder.setVideoFrameRate(20);
recorder.setOutputFile(filePath);
try {
recorder.prepare();
} catch (IllegalStateException e) {
e.printStackTrace();
finish();
} catch (IOException e) {
e.printStackTrace();
finish();
}
}
It's important to note that camera.unlock() has to be called BEFORE the whole initialization of the media recorder. Also be aware of the proper order of each set property; otherwise you'll get an IllegalStateException when calling prepare() or start(). When it comes to recording, I do the following, usually triggered by a view element:
public void record(View view) {
if (recording) {
recorder.stop();
//TODO: do stuff....
recording = false;
} else {
recording = true;
initialize();
recorder.start();
}
}
So now I can finally record properly. But what about that flash? Last but not least, here comes the magic behind the scenes:
public void flash(View view) {
if(!recording) {
camera.lock();
}
parameters.setFlashMode(parameters.getFlashMode().equals(Parameters.FLASH_MODE_TORCH) ? Parameters.FLASH_MODE_OFF : Parameters.FLASH_MODE_TORCH);
camera.setParameters(parameters);
if(!recording) {
camera.unlock();
}
}
Every time I call that method via an onClick action I can change the flash mode, even during recording. Just take care to lock the camera properly. Once the lock is acquired by the media recorder during recording, you don't have to lock/unlock the camera again; it doesn't even work. This was tested on a Samsung Galaxy S3 with Android version 4.1.2. Hope this approach helps.
After preparing the media recorder, call camera.lock(), and then set whatever parameters you want on the camera.
But before starting the recording you need to call camera.unlock() again, and after you stop the media recorder you need to call camera.lock() to restart the preview.
Enjoy!!!
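A minimal sketch of the sequence described above (camera and recorder are assumed to be the already-configured Camera and MediaRecorder instances; whether this works can be device-dependent):
// After recorder.prepare() has succeeded:
camera.lock();                                   // reclaim the camera to change parameters
Camera.Parameters params = camera.getParameters();
params.setFlashMode(Camera.Parameters.FLASH_MODE_TORCH);
camera.setParameters(params);

camera.unlock();                                 // hand the camera back to the recorder
recorder.start();

// ... later, when recording is finished ...
recorder.stop();
camera.lock();                                   // reclaim the camera before restarting the preview
camera.startPreview();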
Try this, hopefully it will work :)
private static Torch torch;
public Torch() {
super();
torch = this;
}
public static Torch getTorch() {
return torch;
}
private void getCamera() {
if (mCamera == null) {
try {
mCamera = Camera.open();
} catch (RuntimeException e) {
Log.e(TAG, "Camera.open() failed: " + e.getMessage());
}
}
}
public void toggleLight(View view) {
toggleLight();
}
private void toggleLight() {
if (lightOn) {
turnLightOff();
} else {
turnLightOn();
}
}
private void turnLightOn() {
if (!eulaAgreed) {
return;
}
if (mCamera == null) {
Toast.makeText(this, "Camera not found", Toast.LENGTH_LONG).show();
button.setBackgroundColor(COLOR_WHITE);
return;
}
lightOn = true;
Parameters parameters = mCamera.getParameters();
if (parameters == null) {
button.setBackgroundColor(COLOR_WHITE);
return;
}
List<String> flashModes = parameters.getSupportedFlashModes();
if (flashModes == null) {
button.setBackgroundColor(COLOR_WHITE);
return;
}
String flashMode = parameters.getFlashMode();
Log.i(TAG, "Flash mode: " + flashMode);
Log.i(TAG, "Flash modes: " + flashModes);
if (!Parameters.FLASH_MODE_TORCH.equals(flashMode)) {
if (flashModes.contains(Parameters.FLASH_MODE_TORCH)) {
parameters.setFlashMode(Parameters.FLASH_MODE_TORCH);
mCamera.setParameters(parameters);
button.setBackgroundColor(COLOR_LIGHT);
startWakeLock();
} else {
Toast.makeText(this, "Flash mode (torch) not supported",
Toast.LENGTH_LONG).show();
button.setBackgroundColor(COLOR_WHITE);
Log.e(TAG, "FLASH_MODE_TORCH not supported");
}
}
}
private void turnLightOff() {
if (lightOn) {
button.setBackgroundColor(COLOR_DARK);
lightOn = false;
if (mCamera == null) {
return;
}
Parameters parameters = mCamera.getParameters();
if (parameters == null) {
return;
}
List<String> flashModes = parameters.getSupportedFlashModes();
String flashMode = parameters.getFlashMode();
if (flashModes == null) {
return;
}
Log.i(TAG, "Flash mode: " + flashMode);
Log.i(TAG, "Flash modes: " + flashModes);
if (!Parameters.FLASH_MODE_OFF.equals(flashMode)) {
if (flashModes.contains(Parameters.FLASH_MODE_OFF)) {
parameters.setFlashMode(Parameters.FLASH_MODE_OFF);
mCamera.setParameters(parameters);
stopWakeLock();
} else {
Log.e(TAG, "FLASH_MODE_OFF not supported");
}
}
}
}
private void startPreview() {
if (!previewOn && mCamera != null) {
mCamera.startPreview();
previewOn = true;
}
}
private void stopPreview() {
if (previewOn && mCamera != null) {
mCamera.stopPreview();
previewOn = false;
}
}
private void startWakeLock() {
if (wakeLock == null) {
Log.d(TAG, "wakeLock is null, getting a new WakeLock");
PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
Log.d(TAG, "PowerManager acquired");
wakeLock = pm.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, WAKE_LOCK_TAG);
Log.d(TAG, "WakeLock set");
}
wakeLock.acquire();
Log.d(TAG, "WakeLock acquired");
}
private void stopWakeLock() {
if (wakeLock != null) {
wakeLock.release();
Log.d(TAG, "WakeLock released");
}
}
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
if (Eula.show(this)) {
eulaAgreed = true;
}
setContentView(R.layout.main);
button = findViewById(R.id.button);
surfaceView = (SurfaceView) this.findViewById(R.id.surfaceview);
surfaceHolder = surfaceView.getHolder();
surfaceHolder.addCallback(this);
surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
disablePhoneSleep();
Log.i(TAG, "onCreate");
}
To access the device camera, you must declare the CAMERA permission in your Android Manifest. Also be sure to include the <uses-feature> manifest element to declare camera features used by your application. For example, if you use the camera and auto-focus feature, your Manifest should include the following:
<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
A sample that checks for torch support might look something like this:
//Create camera and parameter objects
private Camera mCamera;
private Camera.Parameters mParameters;
private boolean mbTorchEnabled = false;
//... later in a click handler or other location, assuming that the mCamera object has already been instantiated with Camera.open()
mParameters = mCamera.getParameters();
//Get supported flash modes
List<String> flashModes = mParameters.getSupportedFlashModes();
//Make sure that torch mode is supported
//EDIT - wrong and dangerous to check for torch support this way
//if(flashModes != null && flashModes.contains("torch")){
if(flashModes != null && flashModes.contains(Camera.Parameters.FLASH_MODE_TORCH)){
if(mbTorchEnabled){
//Set the flash parameter to off
mParameters.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
}
else{
//Set the flash parameter to use the torch
mParameters.setFlashMode(Camera.Parameters.FLASH_MODE_TORCH);
}
//Commit the camera parameters
mCamera.setParameters(mParameters);
mbTorchEnabled = !mbTorchEnabled;
}
To turn the torch on, you simply set the camera parameter Camera.Parameters.FLASH_MODE_TORCH
Camera mCamera;
Camera.Parameters mParameters;
//Get a reference to the camera/parameters
mCamera = Camera.open();
mParameters = mCamera.getParameters();
//Set the torch parameter
mParameters.setFlashMode(Camera.Parameters.FLASH_MODE_TORCH);
//Commit camera parameters
mCamera.setParameters(mParameters);
To turn the torch off, set Camera.Parameters.FLASH_MODE_OFF

Android MediaRecorder produces corrupt video with green lines

I'm trying to add video recording capability to my app using MediaRecorder in Android, but the resulting video looks corrupt with green lines (audio is fine). The following code is what I use to initialize the MediaRecorder object:
mMediaRecorder = new MediaRecorder();
mCamera.unlock();
mMediaRecorder.setCamera(mCamera);
mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mMediaRecorder.setOutputFile(Utility.CAPTURE_VIDEO_FILENAME);
mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H263);
mMediaRecorder.setMaxDuration(60000);
mMediaRecorder.setVideoFrameRate(20);
mMediaRecorder.setMaxFileSize(5000000);
mMediaRecorder.setVideoSize(352, 288);
mMediaRecorder.setPreviewDisplay(mPreview.mHolder.getSurface());
mMediaRecorder.prepare();
mMediaRecorder.start();
I've already looked at the suggestions here and here, but they don't seem to help my cause. I do think, however, that it might have something to do with incorrect video size. So my question is this: is there any good way to get compatible video sizes when using API level 7? As far as I can tell I can use CamcorderProfile if I'm in API level 8, but nothing in 7.
The video size on a device must match one of its supported preview sizes. You have to first check whether the video size you are setting is available or not; supported sizes differ between devices. So first check the available preview sizes using getSupportedPreviewSizes() and then set the video size. If the video size is not supported, green lines will appear.
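A hedged sketch of that check (mCamera is assumed to be an already-opened Camera instance; how you pick the best match from the list is up to you):
// Pick a preview size the device actually supports before configuring the recorder;
// feeding MediaRecorder an unsupported size is what produces the green artifacts.
Camera.Parameters params = mCamera.getParameters();
List<Camera.Size> supported = params.getSupportedPreviewSizes();
Camera.Size chosen = supported.get(0);   // e.g. take the first entry, or search for the closest match
params.setPreviewSize(chosen.width, chosen.height);
mCamera.setParameters(params);

// Later, when configuring the MediaRecorder, use the same dimensions.
mMediaRecorder.setVideoSize(chosen.width, chosen.height);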
You can change these options and see how the quality varies:
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
The following code will record a video for you in any case:
MediaRecorder recorder;
private void initRecorder() {
recorder = new MediaRecorder();
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
if (this.getResources().getConfiguration().orientation != Configuration.ORIENTATION_LANDSCAPE) {
recorder.setOrientationHint(90);//plays the video correctly
}else{
recorder.setOrientationHint(180);
}
recorder.setOutputFile("/sdcard/MediaAppVideos/"+randomNum+".mp4");
}
private void prepareRecorder() {
recorder.setPreviewDisplay(holder.getSurface());
try {
recorder.prepare();
} catch (IllegalStateException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
}
}
public void surfaceDestroyed(SurfaceHolder holder) {
try {
if (recording) {
recorder.stop();
recording = false;
}
recorder.release();
// finish();
} catch (Exception e) {
}
}
Check your code for setRecordingHint(true):
http://developer.android.com/reference/android/hardware/Camera.Parameters.html#setRecordingHint(boolean)
Setting this parameter causes these green glitches in the video on a few devices.
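If it is present, it will be set on the Camera.Parameters object, roughly like this (a sketch to help you spot it; removing the call or passing false is the suggested fix):
Camera.Parameters params = mCamera.getParameters();
params.setRecordingHint(true);    // the line to look for; try removing it or passing false
mCamera.setParameters(params);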
For me, the green patches happened only on one device at 1920x1080. For higher or lower resolutions the recorded video is okay. When I set the preview size to be the same as the video size, I don't see any green stripes.

Trouble with Android Camera

I have some code I have been experimenting with to see what I can do with the camera device. This following code works, but I have some issues with it that I cannot seem to solve.
The first call never works. The first time the code runs, the onPictureTaken callback is never called, so the file is never written. However, the camera goes through all the other steps, including making the fake shutter noise.
I can't seem to set the picture size to anything other than its default. If I try to set it to something else, the code stops working, the same as above: the camera goes through all the motions, but the onPictureTaken callback is never called.
When the pictures are saved to the DCIM folder, they do not show up in the default 'Photos' app on my phone unless I reboot the phone.
Is there any way through code to disable the shutter noise?
Sorry, the code is a little messy because it's an experiment.
Also, this code is executed in a BroadcastReceiver.
@Override
public void onReceive(Context context, Intent intent) {
// TODO Auto-generated method stub
if(intent.getAction().equals(TAKE_PICTURE_INTENT))
{
Toast.makeText(context, "Test", Toast.LENGTH_LONG).show();
System.out.println("GOT THE INTENT");
try
{
Camera camera = Camera.open();
System.out.println("CAMERA OPENED");
Parameters params = camera.getParameters();
params.set("flash-mode", "off");
params.set("focus-mode", "infinity");
params.set("jpeg-quality", "100");
//params.setPictureSize(2592, 1952);
String str = params.get("picture-size" + "-values");
System.out.println(str);
String size = str.split(",")[0];
System.out.println(size);
//params.set("picture-size", size);
camera.setParameters(params);
System.out.println("CAMERA PARAMETERS SET");
camera.startPreview();
System.out.println("CAMERA PREVIEW STARTED");
camera.autoFocus(new AutoFocusCallBackImpl());
}
catch(Exception ex)
{
System.out.println("CAMERA FAIL, SKIP");
return ;
}
}//if
}//onreceive
private void TakePicture(Camera camera)
{
camera.takePicture(new Camera.ShutterCallback() {
@Override
public void onShutter() {
// TODO Auto-generated method stub
System.out.println("CAMERA SHUTTER CALLBACK");
}
}
, null,
new Camera.PictureCallback() {
public void onPictureTaken(byte[] imageData, Camera c) {
//c.release();
System.out.println("CAMERA CALLBACK");
FileOutputStream outStream = null;
try {
System.out.println("Start Callback");
File esd = Environment.getExternalStorageDirectory();
outStream = new FileOutputStream(esd.getAbsolutePath() + String.format(
"/DCIM/%d.jpg", System.currentTimeMillis()));
outStream.write(imageData);
outStream.close();
System.out.println( "onPictureTaken - wrote bytes: " + imageData.length);
} catch (FileNotFoundException e) {
e.printStackTrace();
System.out.println("File not found exception");
} catch (IOException e) {
e.printStackTrace();
System.out.println("IO exception");
} finally {
System.out.println("Finally");
c.release();
}
}
}
);
//camera.release();
}//TAKE PICTURE
private class AutoFocusCallBackImpl implements Camera.AutoFocusCallback {
@Override
public void onAutoFocus(boolean success, Camera camera) {
//bIsAutoFocused = success; //update the flag used in onKeyDown()
System.out.println("Inside autofocus callback. autofocused="+success);
//play the autofocus sound
//MediaPlayer.create(CameraActivity.this, R.raw.auto_focus).start();
if(success)
{
System.out.println("AUTO FOCUS SUCCEDED");
}
else
{
System.out.println("AUTO FOCUS FAILED");
}
TakePicture(camera);
System.out.println("CALLED TAKE PICTURE");
}
}//AUTOFOCUSCALLBACK
1. First of all, move all the camera logic out of the BroadcastReceiver and into a separate Activity.
2.
When the pictures are saved to the DCIM folder, the taken pictures do not show up in the default 'Photos' app on my phone, unless I reboot the phone.
That is because the MediaScanner needs to be asked to rescan the images/changes once you take a photo. When you reboot the phone, the media scanner scans the media and finds the new images. For this issue you should look at MediaScanner; see the sketch after this list.
3. Follow the Android Camera Tutorial & Camera API.
-Thanks
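A minimal sketch of such a scan, placed right after the JPEG is written in onPictureTaken (context is the Context you received in onReceive and imagePath the path you just wrote to; both names are illustrative):
// Tell the media scanner about the new JPEG so it appears in the Photos app
// without a reboot.
MediaScannerConnection.scanFile(context,
        new String[]{ imagePath },          // path of the file just written
        new String[]{ "image/jpeg" },       // MIME type
        null);                              // no completion callback needed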
