Most of the questions I found on the forum are about recording the processed image, like here:
Recording Live OpenCV Processing on Android
However, my goal is to record the camera view while the processed frames are displayed on the screen. I tried recording with the VideoWriter provided by OpenCV, but the FPS dropped by half while recording. I thought about adding a MediaRecorder to JavaCameraView, but then I have to unlock the camera, which disables the preview.
public void startRecord() {
    mCamera.unlock();
    mediaRecorder = new MediaRecorder();
    mediaRecorder.setCamera(mCamera);
    mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    mediaRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
    mediaRecorder.setOutputFile("/storage/emulated/0/Android/data/com.ms.carcam/files/video1.mp4");
    mediaRecorder.setVideoSize(1280, 720);
    try {
        mediaRecorder.prepare();
    } catch (IOException e) {
        e.printStackTrace();
        return; // don't call start() on an unprepared recorder
    }
    mediaRecorder.start();
}
How can I achieve this?
I am facing an issue where, on a button touch, I start recording audio in my Android application, but when I play the recorded audio back, some of its duration is missing.
Here is my code snippet that starts recording voice on a button touch:
public void start() {
    myRecorder = new MediaRecorder();
    myRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    myRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
    myRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB); // note: AudioEncoder.AMR_NB, not OutputFormat.AMR_NB
    myRecorder.setOutputFile(outputFile);
    try {
        myRecorder.prepare();
    } catch (IOException e) {
        e.printStackTrace();
        return;
    }
    myRecorder.start();
    text.setText("Recording point: Recording...");
}
On sensing the touch event, I call this start() function.
Can anybody tell me an alternative solution? I faced this issue on my Galaxy S3 Android device.
myRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
myRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
myRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
myRecorder.setOutputFile(outputFile);
The piece of code above takes some time, hence the delay between the user pressing the button (start() being called) and the recorder actually starting the recording (myRecorder.start()).
You should initialize the recorder and get it ready before the button is pressed. Just move the above piece of code elsewhere, e.g. onCreate(). Since I don't know the context of this code, I cannot tell you exactly where.
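For illustration, a minimal sketch of that split, assuming the recorder and the button handler live in the same Activity (with outputFile and text as in the question, and onCreate() picked as an example home for the setup):

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // ... view setup ...
    myRecorder = new MediaRecorder();
    myRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
    myRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
    myRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
    myRecorder.setOutputFile(outputFile);
    try {
        myRecorder.prepare(); // recorder is now ready to start instantly
    } catch (IOException e) {
        e.printStackTrace();
    }
}

public void start() {
    myRecorder.start(); // starts almost immediately, no setup delay
    text.setText("Recording point: Recording...");
}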
Just as the title says: is it possible to programmatically interact with or configure the native Camera app on an Android device?
For example, I am trying to have the native camera app take a picture without physically touching the capture button. I don't want to use the camera app's internal timer feature, since that requires a physical touch on the button as well.
So simply put, I want to know if it is possible to take a picture without physically touching the capture button.
Thanks in advance for the reply.
Is it possible to programmatically interact with or configure the native Camera app on an Android device?
Not from an ordinary Android app.
Also, please note that there are several thousand Android device models. There are hundreds of different "native Camera app" implementations across those device models, as device manufacturers often implement their own. Your question implies that you think that there is a single "native Camera app", which is not the case.
For an individual device model, or perhaps a closely related family of devices, with a rooted device, you might be able to work something out with simulated user input. However, the myriad of camera apps means that you would need different rules for each app.
Also, if you are only concerned about your own device, you can use the uiautomator test system to control third-party apps, but that requires a connection to your development machine, as the tests are run from there.
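For illustration, a rough uiautomator sketch of tapping a capture button. The "Shutter" content description is an assumption; it differs per camera app and version and would need to be discovered with uiautomatorviewer for the device in question:

import androidx.test.platform.app.InstrumentationRegistry;
import androidx.test.uiautomator.By;
import androidx.test.uiautomator.UiDevice;
import androidx.test.uiautomator.UiObject2;
import org.junit.Test;

public class CameraShutterTest {
    @Test
    public void tapShutter() {
        UiDevice device = UiDevice.getInstance(InstrumentationRegistry.getInstrumentation());
        // "Shutter" is an assumed content description; inspect the target
        // camera app with uiautomatorviewer to find the real one.
        UiObject2 shutter = device.findObject(By.desc("Shutter"));
        if (shutter != null) {
            shutter.click(); // simulated tap on the capture button
        }
    }
}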
It is possible. Just forget about the "native Camera app" and use the Camera/Camera2 API directly.
Some time ago I tried to make a background service that periodically takes a picture, detects the face, and measures the eye distance, to stop my little daughter from watching the tablet too closely, but this failed because the tablet's camera angle was too narrow to capture her whole face.
I posted part of this app here (this code uses the deprecated Camera interface; it was replaced by the Camera2 interface in API 21):
public void onCreate() {
    super.onCreate();
    mContext = getApplicationContext();
    // Dummy texture so the preview has somewhere to go (no visible UI)
    surfaceTexture = new SurfaceTexture(0);
}

public void takePicture() {
    Camera cam = openFrontCamera(mContext);
    if (cam != null) {
        try {
            cam.setPreviewTexture(surfaceTexture);
            cam.startPreview();
            cam.takePicture(null, null, mPicture);
        } catch (Exception ex) {
            Log.d(LOG_TAG, "Can't take picture!");
        }
    }
}

private static Camera.PictureCallback mPicture = new Camera.PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        BitmapFactory.Options bfo = new BitmapFactory.Options();
        bfo.inPreferredConfig = Bitmap.Config.RGB_565;
        Bitmap bitmap = BitmapFactory.decodeStream(new ByteArrayInputStream(data), null, bfo);
        // Eye distance detection here and saving data
        camera.stopPreview();
        camera.release();
    }
};
/* Check if this device has a camera and open the front one */
private static Camera openFrontCamera(Context context) {
    try {
        boolean hasCamera = context.getPackageManager().hasSystemFeature(PackageManager.FEATURE_CAMERA);
        if (hasCamera) {
            Camera cam = null;
            Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
            int cameraCount = Camera.getNumberOfCameras();
            for (int camIdx = 0; camIdx < cameraCount; camIdx++) {
                Camera.getCameraInfo(camIdx, cameraInfo);
                if (cameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
                    try {
                        cam = Camera.open(camIdx);
                        break; // stop at the first front-facing camera that opens
                    } catch (RuntimeException e) {
                        Log.e(LOG_TAG, "Camera failed to open: " + e.getLocalizedMessage());
                    }
                }
            }
            return cam;
        }
    } catch (Exception ex) {
        Log.d(LOG_TAG, "Can't open front camera");
    }
    return null;
}
Some additional info: to use this code, your app must declare the camera permission in AndroidManifest.xml:
<uses-permission android:name="android.permission.CAMERA" />
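On Android 6.0+, the manifest entry alone is not enough: CAMERA is a dangerous permission that must also be granted at runtime. A minimal sketch of the request, assuming it runs from an Activity before the service starts (the request code 1 is arbitrary):

// ContextCompat/ActivityCompat come from the support (now AndroidX) library
if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
        != PackageManager.PERMISSION_GRANTED) {
    ActivityCompat.requestPermissions(this,
            new String[]{Manifest.permission.CAMERA}, 1); // result arrives in onRequestPermissionsResult()
}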
Yes, you can capture an image without any user input, even in the background and without a visible preview. Take a look here. Hope this helps!
I have an Arduino that broadcasts input from a mic via Bluetooth.
I want to connect the phone so that it'll record and save the input from the Arduino mic via Bluetooth.
When I run the following code, I have a few issues.
I can't seem to find the file I saved.
File file = new File(mFilePath2, "test.txt");
In Logcat I'm getting the following errors when I run Bluetooth_Test():
ACDB-LOADER - Error: ACDB AudProc vol returned = -19
MediaPlayer - Error (1, -1004)
When I run stop() I get:
MPEG4Writer - Stop() called but track is not started
MediaPlayer-JNI QCMediaPlayer mediaplayer NOT present
MediaPlayer - Should have subtitle controller already set
I'm not sure what is happening and I'm not sure how to figure it out.
References:
Sony - Use Bluetooth for audio I/O
Media Player called in state 0, error (-38,0) (it solved one error, not listed above)
Code:
public void Bluetooth_Test() {
    Toast.makeText(getActivity(), "weee", Toast.LENGTH_LONG).show();
    maudioManager = (AudioManager) getActivity().getSystemService(getActivity().AUDIO_SERVICE);
    // Switch to headset
    maudioManager.setMode(AudioManager.MODE_IN_CALL); // to use the headset's I/O and not the phone's
    // Start audio I/O operation (in background)
    maudioManager.startBluetoothSco();
    mRecorder = new MediaRecorder();
    mRecorder.setAudioSource(MediaRecorder.AudioSource.MIC); // set source to current mic (should be Bluetooth)
    mFilePath = Environment.getExternalStorageDirectory().getAbsolutePath();
    String mFilePath2 = mFilePath;
    mFilePath += "/youraudiofile.mp3";
    File file = new File(mFilePath2, "test.txt");
    // Set the container format for the recorded audio file
    mRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
    // Set a file path for the recorded audio file
    mRecorder.setOutputFile(mFilePath);
    // Set encoding of the audio
    mRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
    try {
        mRecorder.prepare();
    } catch (IOException e) {
        Log.e("Starting mRecorder", "IO Exception");
    }
    // Start recording
    mRecorder.start();
}
public void stop() {
    mRecorder.stop();
    mRecorder.reset();
    mRecorder.release();
    mRecorder = null;
    final MediaPlayer mPlayer = new MediaPlayer();
    try {
        mPlayer.setDataSource(mFilePath);
        mPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                mPlayer.start(); // Start playing the audio file
            }
        });
        // Audio file to be played
        mPlayer.prepareAsync();
    } catch (IOException e) {
        Log.e("Stop Function", "IO Exception");
    }
}
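One detail worth knowing about the code above: startBluetoothSco() is asynchronous, so the SCO link may not be up yet when mRecorder.start() runs, and early audio can be lost or mis-routed. A sketch of waiting for the connection first; startRecorder() is a hypothetical helper containing the MediaRecorder setup from Bluetooth_Test():

// Wait for the Bluetooth SCO link before recording; ACTION_SCO_AUDIO_STATE_UPDATED
// is broadcast by AudioManager whenever the SCO state changes.
IntentFilter filter = new IntentFilter(AudioManager.ACTION_SCO_AUDIO_STATE_UPDATED);
getActivity().registerReceiver(new BroadcastReceiver() {
    @Override
    public void onReceive(Context context, Intent intent) {
        int state = intent.getIntExtra(AudioManager.EXTRA_SCO_AUDIO_STATE, -1);
        if (state == AudioManager.SCO_AUDIO_STATE_CONNECTED) {
            context.unregisterReceiver(this);
            startRecorder(); // hypothetical helper: the MediaRecorder setup above
        }
    }
}, filter);
maudioManager.startBluetoothSco();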
I've had a look through other people's encounters with this problem and have not found an adequate solution.
Like them, I followed the tutorial on camera functionality at: http://developer.android.com/guide/topics/media/camera.html.
Everything listed below works perfectly, to the point where I assume the program has recorded video as I intended. However, upon reviewing the video in the gallery, it has not appeared. I'm confused, as there are no IOExceptions or other bugs present when connected for USB debugging. Strangely, upon removing the USB and plugging it in again, whenever that may be, either immediately or days in the future, all of the previously recorded videos appear in the gallery. Clearly there is something I have missed or some aspect of recording video that I am not aware of. I would appreciate any help or guidance, thank you.
Pertinent code is as follows, I'll post more if someone needs it.
Camera Activity:
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_camera);
    mCamera = MainActivity.getCameraInstance();
    // Create our Preview view and set it as the content of our activity.
    mPreview = new CameraPreview(this, mCamera);
    FrameLayout preview = (FrameLayout) findViewById(R.id.camera_preview);
    preview.addView(mPreview);
    button_capture = (Button) findViewById(R.id.button_capture);
    button_capture.setOnClickListener(new View.OnClickListener() {
        @Override
        public void onClick(View v) {
            if (isRecording) {
                // stop recording and release camera
                mMediaRecorder.stop();   // stop the recording
                releaseMediaRecorder();  // release the MediaRecorder object
                mCamera.lock();          // take camera access back from MediaRecorder
                // inform the user that recording has stopped
                button_capture.setText("Capture");
                isRecording = false;
            } else {
                // initialize video camera
                if (prepareVideoRecorder()) {
                    // Camera is available and unlocked, MediaRecorder is prepared,
                    // now you can start recording
                    mMediaRecorder.start();
                    // inform the user that recording has started
                    button_capture.setText("Stop");
                    isRecording = true;
                } else {
                    // prepare didn't work, release the camera
                    releaseMediaRecorder();
                    // inform user
                }
            }
        }
    });
}
private boolean prepareVideoRecorder() {
    mMediaRecorder = new MediaRecorder();
    // Step 1: Unlock and set camera to MediaRecorder
    mCamera.unlock();
    mMediaRecorder.setCamera(mCamera);
    // Step 2: Set sources
    mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
    mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    // Step 3: Set a CamcorderProfile (requires API Level 8 or higher)
    mMediaRecorder.setProfile(CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH));
    // Step 4: Set output file
    mMediaRecorder.setOutputFile(MediaCapture.getOutputMediaFile(MEDIA_TYPE_VIDEO).toString());
    // Step 5: Set the preview output
    mMediaRecorder.setPreviewDisplay(mPreview.getHolder().getSurface());
    // Step 6: Prepare configured MediaRecorder
    try {
        mMediaRecorder.prepare();
    } catch (IllegalStateException e) {
        Log.d(TAG, "IllegalStateException preparing MediaRecorder: " + e.getMessage());
        releaseMediaRecorder();
        return false;
    } catch (IOException e) {
        Log.d(TAG, "IOException preparing MediaRecorder: " + e.getMessage());
        releaseMediaRecorder();
        return false;
    }
    return true;
}
Media Capture:
public static Uri getOutputMediaFileUri(int type) {
    // Pass the requested type through instead of hard-coding MEDIA_TYPE_VIDEO
    return Uri.fromFile(getOutputMediaFile(type));
}

/**
 * Create a File for saving an image or video
 */
public static File getOutputMediaFile(int type) {
    // To be safe, you should check that the SDCard is mounted
    // using Environment.getExternalStorageState() before doing this.
    File mediaStorageDir = new File(Environment.getExternalStoragePublicDirectory(
            Environment.DIRECTORY_MOVIES), "MyApplication");
    // This location works best if you want the created images to be shared
    // between applications and persist after your app has been uninstalled.

    // Create the storage directory if it does not exist
    if (!mediaStorageDir.exists()) {
        if (!mediaStorageDir.mkdirs()) {
            Log.d("MyApplication", "failed to create directory");
            return null;
        }
    }

    // Create a media file name; avoid colons, which are not valid in
    // filenames on FAT-formatted external storage
    String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
    File mediaFile;
    if (type == MEDIA_TYPE_VIDEO) {
        mediaFile = new File(mediaStorageDir.getPath() + File.separator +
                "VID_" + timeStamp + ".mp4");
    } else {
        return null;
    }
    return mediaFile;
}
}
I too was very perplexed by this behaviour until I realised that unplugging the USB was causing the files to appear. This appears to be a bug in Android, possibly related to this one:
https://code.google.com/p/android/issues/detail?id=195362
Clearly, the file has been written correctly, but for some reason the new file is not visible. In any case, the solution/workaround offered in the link above is to force a media scan of your file, e.g.:
MediaScannerConnection.scanFile(getContext(), new String[]{this.mediaFile.getAbsolutePath()}, null, null);
Here 'mediaFile' is the file your MediaRecorder has just finished writing. With this in place I do not need to unplug the USB cable to see the file; it appears immediately after recording has finished.
I am running Android 5.0.2 on a Samsung Galaxy A5. This feels more of a workaround than a fix and I can't be sure it will work on other devices or Android versions, but I hope it helps someone.
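For context, here is roughly how the scan slots into the stop branch of the onClick handler from the question. CameraActivity and mCurrentVideoPath are hypothetical names: the field would be set to the output path when recording starts.

mMediaRecorder.stop();       // stop the recording
releaseMediaRecorder();      // release the MediaRecorder object
mCamera.lock();              // take camera access back from MediaRecorder
// Make the new file visible in the gallery without replugging USB
MediaScannerConnection.scanFile(CameraActivity.this,
        new String[]{mCurrentVideoPath}, null,
        new MediaScannerConnection.OnScanCompletedListener() {
            @Override
            public void onScanCompleted(String path, Uri uri) {
                Log.d(TAG, "Scanned " + path + " -> " + uri);
            }
        });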
I'm trying to add video recording capability to my app using MediaRecorder in Android, but the resulting video looks corrupt with green lines (audio is fine). The following code is what I use to initialize the MediaRecorder object:
mMediaRecorder = new MediaRecorder();
mCamera.unlock();
mMediaRecorder.setCamera(mCamera);
mMediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mMediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
mMediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mMediaRecorder.setOutputFile(Utility.CAPTURE_VIDEO_FILENAME);
mMediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
mMediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H263);
mMediaRecorder.setMaxDuration(60000);
mMediaRecorder.setVideoFrameRate(20);
mMediaRecorder.setMaxFileSize(5000000);
mMediaRecorder.setVideoSize(352, 288);
mMediaRecorder.setPreviewDisplay(mPreview.mHolder.getSurface());
mMediaRecorder.prepare();
mMediaRecorder.start();
I've already looked at the suggestions here and here, but they don't seem to help my cause. I do think, however, that it might have something to do with incorrect video size. So my question is this: is there any good way to get compatible video sizes when using API level 7? As far as I can tell I can use CamcorderProfile if I'm in API level 8, but nothing in 7.
On a given device, the valid video sizes are the same as the preview sizes, and they differ between devices, so you first have to check whether the video size you are setting is actually available. Check the available sizes with getSupportedPreviewSizes() and only then call setVideoSize(); an unsupported video size can produce exactly these green lines.
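A sketch of that check against the question's code, assuming mCamera is the open camera and the sizes are queried before mCamera.unlock():

// Pick a video size the device actually supports; fall back to the first
// supported size if 352x288 from the question is not in the list.
Camera.Parameters params = mCamera.getParameters();
List<Camera.Size> supported = params.getSupportedPreviewSizes();
Camera.Size chosen = supported.get(0);
for (Camera.Size s : supported) {
    if (s.width == 352 && s.height == 288) {
        chosen = s;
        break;
    }
}
mMediaRecorder.setVideoSize(chosen.width, chosen.height);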
You can change these options and see how the quality varies:
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
The following code will record a video for you in any case:
MediaRecorder recorder;

private void initRecorder() {
    recorder = new MediaRecorder();
    recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
    recorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
    if (this.getResources().getConfiguration().orientation != Configuration.ORIENTATION_LANDSCAPE) {
        recorder.setOrientationHint(90); // plays the video correctly in portrait
    } else {
        recorder.setOrientationHint(180);
    }
    // randomNum is generated elsewhere; it gives each recording a unique name
    recorder.setOutputFile("/sdcard/MediaAppVideos/" + randomNum + ".mp4");
}

private void prepareRecorder() {
    recorder.setPreviewDisplay(holder.getSurface());
    try {
        recorder.prepare();
    } catch (IllegalStateException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
}

public void surfaceDestroyed(SurfaceHolder holder) {
    try {
        if (recording) {
            recorder.stop();
            recording = false;
        }
        recorder.release();
        // finish();
    } catch (Exception e) {
        // ignore errors while tearing down the recorder
    }
}
Check your code for setRecordingHint(true):
http://developer.android.com/reference/android/hardware/Camera.Parameters.html#setRecordingHint(boolean)
Setting this parameter causes these green glitches in the video on a few devices.
For me, the green patches happened only on one device at 1920x1080; at higher or lower resolutions the recorded video was okay. When I set the preview size to the same size as the video size, I don't see any green strips.
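A sketch of that workaround, assuming the camera is configured before it is handed to the MediaRecorder, with 1920x1080 as in the failing case (and only if that size appears in getSupportedPreviewSizes()):

// Match the preview size to the intended video size before unlocking
Camera.Parameters params = mCamera.getParameters();
params.setPreviewSize(1920, 1080);
mCamera.setParameters(params);
mCamera.unlock();
mMediaRecorder.setCamera(mCamera);
mMediaRecorder.setVideoSize(1920, 1080); // preview and video now agree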