Android MediaRecorder produces corrupt video with green lines - java

I'm trying to add video recording capability to my app using MediaRecorder in Android, but the resulting video looks corrupt with green lines (audio is fine). The following code is what I use to initialize the MediaRecorder object:
mMediaRecorder = new MediaRecorder();
mCamera.unlock();
mMediaRecorder.setCamera(mCamera);
mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mMediaRecorder.setOutputFile(Utility.CAPTURE_VIDEO_FILENAME);
mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H263);
mMediaRecorder.setMaxDuration(60000);
mMediaRecorder.setVideoFrameRate(20);
mMediaRecorder.setMaxFileSize(5000000);
mMediaRecorder.setVideoSize(352, 288);
mMediaRecorder.setPreviewDisplay(mPreview.mHolder.getSurface());
mMediaRecorder.prepare();
mMediaRecorder.start();
I've already looked at the suggestions here and here, but they don't seem to help my cause. I do think, however, that it might have something to do with incorrect video size. So my question is this: is there any good way to get compatible video sizes when using API level 7? As far as I can tell I can use CamcorderProfile if I'm in API level 8, but nothing in 7.

Video sizes on a device are limited to its supported preview sizes, so you first have to check whether the video size you are setting is actually available. Supported sizes differ between devices: query them with getSupportedPreviewSizes() and only then call setVideoSize(). If the video size is not supported, you get exactly these green lines.
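A minimal sketch of that check (assuming the mCamera and mMediaRecorder objects from the question; getSupportedPreviewSizes() exists since API 5, so it works on API level 7):
// Query the sizes the camera actually supports and only pass one of those to setVideoSize().
Camera.Parameters params = mCamera.getParameters();
List<Camera.Size> previewSizes = params.getSupportedPreviewSizes();
Camera.Size chosen = previewSizes.get(0);         // fall back to the first supported size
for (Camera.Size s : previewSizes) {
    if (s.width == 352 && s.height == 288) {      // prefer CIF if the device offers it
        chosen = s;
        break;
    }
}
mMediaRecorder.setVideoSize(chosen.width, chosen.height);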

You can change these options and see how the quality varies:
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
The following code will record a video for you in any case:
MediaRecorder recorder;

private void initRecorder() {
    recorder = new MediaRecorder();
    recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
    recorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
    if (this.getResources().getConfiguration().orientation != Configuration.ORIENTATION_LANDSCAPE) {
        recorder.setOrientationHint(90); // plays the video correctly in portrait
    } else {
        recorder.setOrientationHint(180);
    }
    // randomNum is assumed to be any unique value used to build the file name
    recorder.setOutputFile("/sdcard/MediaAppVideos/" + randomNum + ".mp4");
}

private void prepareRecorder() {
    recorder.setPreviewDisplay(holder.getSurface());
    try {
        recorder.prepare();
    } catch (IllegalStateException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
}

public void surfaceDestroyed(SurfaceHolder holder) {
    try {
        if (recording) {
            recorder.stop();
            recording = false;
        }
        recorder.release();
        // finish();
    } catch (Exception e) {
        // ignore: stop() throws if nothing was recorded
    }
}

Check your code for setRecordingHint(true):
http://developer.android.com/reference/android/hardware/Camera.Parameters.html#setRecordingHint(boolean)
Setting this parameter causes exactly these green glitches in the video on a few devices.
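If something in your pipeline does enable the hint, here is a sketch of the workaround (assuming an open Camera instance named mCamera; setRecordingHint() exists since API 14):
// Explicitly keep the recording hint at its default (false); enabling it is
// what reportedly triggers the green glitches on some devices.
Camera.Parameters params = mCamera.getParameters();
params.setRecordingHint(false);
mCamera.setParameters(params);
mCamera.unlock();   // then hand the camera to MediaRecorder as usual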

For me the green patches happened only on one device, and only at 1920x1080; at higher or lower resolutions the recorded video was fine. Once I set the preview size to be the same as the video size, I no longer saw any green strips.
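One way to do that with the legacy Camera API (a sketch assuming the mCamera and mMediaRecorder objects from the question) is to feed the same dimensions to both the preview and the recorder:
// Use one size, taken from getSupportedPreviewSizes(), for both preview and video.
int width = 1280, height = 720;                   // example; must be a supported size
Camera.Parameters params = mCamera.getParameters();
params.setPreviewSize(width, height);
mCamera.setParameters(params);
mCamera.unlock();
mMediaRecorder.setCamera(mCamera);
// ... set sources, output format and encoders as in the question ...
mMediaRecorder.setVideoSize(width, height);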

Related

Recording video with MediaRecorder parallel to image processing

Most of the questions I found on the forum are about recording the processed image, like here:
Recording Live OpenCV Processing on Android
However, my goal is to record the camera view while the processed frames are displayed on the screen. I tried recording with the VideoWriter provided by OpenCV, but the FPS roughly halved while recording. I thought about adding MediaRecorder to JavaCameraView, but then I have to unlock the camera, which disables the preview.
public void startRecord() {
    mCamera.unlock();
    mediaRecorder = new MediaRecorder();
    mediaRecorder.setCamera(mCamera);
    mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    mediaRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
    mediaRecorder.setOutputFile("/storage/emulated/0/Android/data/com.ms.carcam/files/video1.mp4");
    mediaRecorder.setVideoSize(1280, 720);
    try {
        mediaRecorder.prepare();
    } catch (IOException e) {
        e.printStackTrace();
    }
    mediaRecorder.start();
}
How can I achieve this?

How to Disable All Automatics in Android Camera2 API

I'm trying to disable auto-exposure, auto-focus, and auto-white-balance in Google's Camera2Basic sample. Here's my code:
private void disableAutomatics() {
    try {
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_MODE, CaptureRequest.CONTROL_MODE_OFF);
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_VIDEO_STABILIZATION_MODE, CaptureRequest.CONTROL_VIDEO_STABILIZATION_MODE_OFF);
        mPreviewRequestBuilder.set(CaptureRequest.LENS_OPTICAL_STABILIZATION_MODE, CaptureRequest.LENS_OPTICAL_STABILIZATION_MODE_OFF);
        mPreviewRequestBuilder.set(CaptureRequest.LENS_FOCUS_DISTANCE, .2f);
        mPreviewRequestBuilder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, 1000000L);
        mPreviewRequest = mPreviewRequestBuilder.build();
        // Set new repeating request with our changed one
        mCaptureSession.setRepeatingRequest(mPreviewRequest, mCaptureCallback, mBackgroundHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
The problem is I don't know where to place the method in Camera2BasicFragment.java.
Any help would be greatly appreciated.
There are two places where you may want to apply those settings:
If you want to do it before the preview starts, the better place is inside the overridden onConfigured callback within the createCameraPreviewSession() method (around line 696 of the Camera2BasicFragment file provided in Google's Camera2Basic sample):
private void createCameraPreviewSession() {
    try {
        SurfaceTexture texture = mTextureView.getSurfaceTexture();
        assert texture != null;
        // We configure the size of default buffer to be the size of camera preview we want.
        texture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());
        // This is the output Surface we need to start preview.
        Surface surface = new Surface(texture);
        // We set up a CaptureRequest.Builder with the output Surface.
        mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
        mPreviewRequestBuilder.addTarget(surface);
        // Here, we create a CameraCaptureSession for camera preview.
        mCameraDevice.createCaptureSession(Arrays.asList(surface, mImageReader.getSurface()),
                new CameraCaptureSession.StateCallback() {
                    @Override
                    public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
                        // The camera is already closed
                        if (null == mCameraDevice) {
                            return;
                        }
                        // When the session is ready, we start displaying the preview.
                        mCaptureSession = cameraCaptureSession;
                        try {
                            // Place your custom camera settings here
                            // Start displaying the camera preview.
                            mPreviewRequest = mPreviewRequestBuilder.build();
                            mCaptureSession.setRepeatingRequest(mPreviewRequest,
                                    mCaptureCallback, mBackgroundHandler);
                        } catch (CameraAccessException e) {
                            e.printStackTrace();
                        }
                    }

                    @Override
                    public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
                        showToast("Failed");
                    }
                }, null
        );
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
If you want to apply the settings after the preview has started, at runtime, just call your disableAutomatics() from the UI or anywhere else and it should work fine.
Note that you don't have to close the older capture session by calling its CaptureSession.close() method, as explained in an answer to this other question, because the new repeating request replaces the old one.
On the other hand, I am not sure about setting the exposure time manually as you did in your question:
mPreviewRequestBuilder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, 1000000L);
because you may get unexpected results that vary across devices. Doing this is usually discouraged; it's preferred to let the camera adjust on its own and then lock auto-exposure (AE):
mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_LOCK, true);
You can check the CONTROL_AE_LOCK reference here.
But if your code needs a fixed exposure time, then it should work.
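For reference, here is a minimal sketch of that AE-lock approach, reusing the mPreviewRequestBuilder, mCaptureSession, mCaptureCallback and mBackgroundHandler fields from the sample:
// Let auto-exposure converge first, then freeze it instead of dialing in
// SENSOR_EXPOSURE_TIME manually.
private void lockAutoExposure() {
    try {
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_LOCK, true);
        mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AWB_LOCK, true); // optional: freeze white balance too
        mPreviewRequest = mPreviewRequestBuilder.build();
        mCaptureSession.setRepeatingRequest(mPreviewRequest, mCaptureCallback, mBackgroundHandler);
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}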

Is it possible to programmatically take photo with native Camera app on Android device without physically tapping the capture button?

Just as the title says, is it possible to programmatically interact with or configure the native Camera app on an Android device?
For example, I am trying to make the native camera app take a picture without physically touching the capture button. I don't want to use the camera app's internal timer feature, since that requires physically touching a button as well.
So simply put, I want to know whether it is possible to take a picture without physically touching the capture button.
Thanks in advance for any replies.
Is it possible to programmatically interact with or configure the native Camera app on an Android device?
Not from an ordinary Android app.
Also, please note that there are several thousand Android device models. There are hundreds of different "native Camera app" implementations across those device models, as device manufacturers often implement their own. Your question implies that you think that there is a single "native Camera app", which is not the case.
For an individual device model, or perhaps a closely related family of devices, with a rooted device, you might be able to work something out with simulated user input. However, the myriad of camera apps means that you would need different rules for each app.
Also, if you are only concerned about your own device, you can use the uiautomator test system to control third-party apps, but that requires a connection to your development machine, as the tests are run from there.
It is possible. Just forget about "native Camera app" and use Camera/Camera2 API directly.
Some time ago I tried to make a background service that takes a picture periodically, detects a face and measures the eye distance, to keep my little daughter from watching the tablet too closely. It failed because the tablet camera's angle was too narrow to capture her whole face.
I posted part of that app here (this code uses the deprecated Camera interface; it was replaced by the Camera2 interface in API 21):
public void onCreate() {
    super.onCreate();
    mContext = getApplicationContext();
    surfaceTexture = new SurfaceTexture(0);
}

public void takePicture() {
    Camera cam = openFrontCamera(mContext);
    if (cam != null) {
        try {
            cam.setPreviewTexture(surfaceTexture);
            cam.startPreview();
            cam.takePicture(null, null, mPicture);
        } catch (Exception ex) {
            Log.d(LOG_TAG, "Can't take picture!");
        }
    }
}

private static Camera.PictureCallback mPicture = new Camera.PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        BitmapFactory.Options bfo = new BitmapFactory.Options();
        bfo.inPreferredConfig = Bitmap.Config.RGB_565;
        Bitmap bitmap = BitmapFactory.decodeStream(new ByteArrayInputStream(data), null, bfo);
        // Eye distance detection here and saving data
        camera.stopPreview();
        camera.release();
    }
};

/* Check if this device has a camera and open the front-facing one */
private static Camera openFrontCamera(Context context) {
    try {
        boolean hasCamera = context.getPackageManager().hasSystemFeature(PackageManager.FEATURE_CAMERA);
        if (hasCamera) {
            int cameraCount = 0;
            Camera cam = null;
            Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
            cameraCount = Camera.getNumberOfCameras();
            for (int camIdx = 0; camIdx < cameraCount; camIdx++) {
                Camera.getCameraInfo(camIdx, cameraInfo);
                if (cameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
                    try {
                        cam = Camera.open(camIdx);
                    } catch (RuntimeException e) {
                        Log.e(LOG_TAG, "Camera failed to open: " + e.getLocalizedMessage());
                    }
                }
            }
            return cam;
        }
    } catch (Exception ex) {
        Log.d(LOG_TAG, "Can't open front camera");
    }
    return null;
}
Some additional info: to use this code your app needs the camera permission in AndroidManifest.xml:
<uses-permission android:name="android.permission.CAMERA" />
Yes, you can capture an image without any user input, also in the background without a preview frame. Take a look here. Hope this helps you!

MediaRecorder not saving files

I've had a look through other people's encounters with this problem and have not found an adequate solution.
Like them, I followed the tutorial on camera functionality at: http://developer.android.com/guide/topics/media/camera.html.
Everything listed below works to the point where I assume the program has recorded video as intended. However, upon reviewing the gallery, the video has not appeared. I'm confused, as there are no IOExceptions or other errors when connected for USB debugging. Strangely, upon unplugging the USB cable and plugging it in again, whenever that may be, either immediately or days later, all of the previously recorded videos appear in the gallery. Clearly there is something I have missed or some aspect of recording video that I am not aware of. I would appreciate any help or guidance, thank you.
Pertinent code is as follows, I'll post more if someone needs it.
Camera Activity:
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_camera);
    mCamera = MainActivity.getCameraInstance();
    // Create our Preview view and set it as the content of our activity.
    mPreview = new CameraPreview(this, mCamera);
    FrameLayout preview = (FrameLayout) findViewById(R.id.camera_preview);
    preview.addView(mPreview);
    button_capture = (Button) findViewById(R.id.button_capture);
    button_capture.setOnClickListener(
            new View.OnClickListener() {
                @Override
                public void onClick(View v) {
                    if (isRecording) {
                        // stop recording and release camera
                        mMediaRecorder.stop();  // stop the recording
                        releaseMediaRecorder(); // release the MediaRecorder object
                        mCamera.lock();         // take camera access back from MediaRecorder
                        // inform the user that recording has stopped
                        button_capture.setText("Capture");
                        isRecording = false;
                    } else {
                        // initialize video camera
                        if (prepareVideoRecorder()) {
                            // Camera is available and unlocked, MediaRecorder is prepared,
                            // now you can start recording
                            mMediaRecorder.start();
                            // inform the user that recording has started
                            button_capture.setText("Stop");
                            isRecording = true;
                        } else {
                            // prepare didn't work, release the camera
                            releaseMediaRecorder();
                            // inform user
                        }
                    }
                }
            });
}
private boolean prepareVideoRecorder() {
    mMediaRecorder = new MediaRecorder();
    // Step 1: Unlock and set camera to MediaRecorder
    mCamera.unlock();
    mMediaRecorder.setCamera(mCamera);
    // Step 2: Set sources
    mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
    mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    // Step 3: Set a CamcorderProfile (requires API Level 8 or higher)
    mMediaRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
    // Step 4: Set output file
    mMediaRecorder.setOutputFile(MediaCapture.getOutputMediaFile(MEDIA_TYPE_VIDEO).toString());
    // Step 5: Set the preview output
    mMediaRecorder.setPreviewDisplay(mPreview.getHolder().getSurface());
    // Step 6: Prepare configured MediaRecorder
    try {
        mMediaRecorder.prepare();
    } catch (IllegalStateException e) {
        Log.d(TAG, "IllegalStateException preparing MediaRecorder: " + e.getMessage());
        releaseMediaRecorder();
        return false;
    } catch (IOException e) {
        Log.d(TAG, "IOException preparing MediaRecorder: " + e.getMessage());
        releaseMediaRecorder();
        return false;
    }
    return true;
}
Media Capture:
public static Uri getOutputMediaFileUri(int type) {
    return Uri.fromFile(getOutputMediaFile(MEDIA_TYPE_VIDEO));
}

/**
 * Create a File for saving an image or video
 */
public static File getOutputMediaFile(int type) {
    // To be safe, you should check that the SDCard is mounted
    // using Environment.getExternalStorageState() before doing this.
    File mediaStorageDir = new File(Environment.getExternalStoragePublicDirectory(
            Environment.DIRECTORY_MOVIES), "MyApplication");
    // This location works best if you want the created images to be shared
    // between applications and persist after your app has been uninstalled.
    // Create the storage directory if it does not exist
    if (!mediaStorageDir.exists()) {
        if (!mediaStorageDir.mkdirs()) {
            Log.d("MyApplication", "failed to create directory");
            return null;
        }
    }
    // Create a media file name
    String timeStamp = new SimpleDateFormat("yyyyMMdd_HH:mm:ss").format(new Date());
    File mediaFile;
    if (type == MEDIA_TYPE_VIDEO) {
        mediaFile = new File(mediaStorageDir.getPath() + File.separator +
                "VID_" + timeStamp + ".mp4");
    } else {
        return null;
    }
    return mediaFile;
}
I too was very perplexed by this behaviour until I realised that unplugging the USB cable was causing the files to appear. This appears to be a bug in Android, possibly related to this one:
https://code.google.com/p/android/issues/detail?id=195362
Clearly the file has been written correctly, but for some reason the new file is not visible. In any case, the solution/workaround offered in the link above is to force a media scan of your file, e.g.:
MediaScannerConnection.scanFile(getContext(), new String[]{this.mediaFile.getAbsolutePath()}, null, null);
Where 'mediaFile' is the file your MediaRecorder has just finished writing. With this in place I do not need to unplug the USB cable to see the file; it appears immediately after recording has finished.
I am running Android 5.0.2 on a Samsung Galaxy A5. This feels more of a workaround than a fix and I can't be sure it will work on other devices or Android versions, but I hope it helps someone.
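For context, here is a sketch of where that scan call can go in the stop branch of the question's click handler (mediaFile stands for whatever File was passed to setOutputFile(), which the question builds in getOutputMediaFile()):
// After stopping the recorder, ask the media scanner to index the new file so it
// shows up in the gallery immediately, without replugging the USB cable.
mMediaRecorder.stop();
releaseMediaRecorder();
mCamera.lock();
MediaScannerConnection.scanFile(getApplicationContext(),
        new String[]{ mediaFile.getAbsolutePath() }, null,
        new MediaScannerConnection.OnScanCompletedListener() {
            @Override
            public void onScanCompleted(String path, Uri uri) {
                Log.d("MyApplication", "Scanned " + path + " -> " + uri);
            }
        });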

android camera click continuous shots

I am trying to make a camera application that takes 3 continuous shots.
I have tried calling "takePicture" several times by putting it in a loop, but with no success.
Please help with this; any help will be appreciated.
You never should call PictureCallback.onPictureTaken() from your code; this callback receives data from the system when it is ready, as response to Camera.takePicture().
The latter call will only succeed if the camera is open and preview is started. Therefore, simply calling Camera.takePicture() in a loop will not work (see e.g. Android 2.3.1 Camera takePicture() Multiple images with one button click). The correct way to handle this is to keep a counter of shots processed in your onPictureTaken(); if it is less than 3, restart the camera preview and issue (synchronously) another Camera.takePicture(). After this, onPictureTaken() should return, to allow processing of the next captured frame.
I use it like this when doing a photo burst. It also handles the FrameLayout holding the preview that starts the burst:
PictureCallback jpegCallback = new PictureCallback() {
    public void onPictureTaken(byte[] data, Camera camera) {
        FileOutputStream outStream = null;
        try {
            Parameters param = camera.getParameters();
            param.setPictureSize(640, 480);
            camera.setParameters(param);
            // Or write to sdcard
            outStream = new FileOutputStream(String.format(
                    Environment.getExternalStorageDirectory().getPath() + "/foto%d.jpg",
                    System.currentTimeMillis()));
            outStream.write(data);
            outStream.close();
            sendBroadcast(new Intent(Intent.ACTION_MEDIA_MOUNTED,
                    Uri.fromFile(Environment.getExternalStorageDirectory())));
            Log.i(TAG, "onPictureTaken - wrote bytes: " + data.length);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
        }
        Log.d(TAG, "onPictureTaken - jpg");
        try {
            stillCount++;
            camera.startPreview();
            if (stillCount < 10) {
                preview.mCamera.takePicture(shutterCallback, rawCallback,
                        jpegCallback);
                if (stillCount == 9) {
                    frameLayout.setClickable(true);
                }
            } else {
                stillCount = 0;
                takePictureButton.setEnabled(true);
                frameLayout.setClickable(true);
            }
        } catch (Exception e) {
            Log.d(TAG, "Error starting preview: " + e.toString());
        }
    }
};
I found the solution.
I was calling mCamera.startPreview() outside of my loop. A running preview is required to take shots, and not calling mCamera.startPreview() before each one was blocking my execution.
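For completeness, here is a minimal sketch of the required order (assuming an already opened Camera in mCamera and a dummy SurfaceTexture, as in the background-service answer above), since takePicture() only works while a preview is running:
try {
    // takePicture() requires a running preview, so attach a (dummy) surface and start it first.
    SurfaceTexture dummy = new SurfaceTexture(0);
    mCamera.setPreviewTexture(dummy);
    mCamera.startPreview();
    mCamera.takePicture(null, null, new Camera.PictureCallback() {
        @Override
        public void onPictureTaken(byte[] data, Camera camera) {
            // save 'data' here, then restart the preview before triggering the next shot
            camera.startPreview();
        }
    });
} catch (IOException e) {
    e.printStackTrace();
}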
