Android camera2 preview image disorder when saved using ImageReader - java

I am taking a series of pictures using the Android Camera2 API for real-time pose estimation and environment reconstruction (the SLAM problem). Currently I simply save all of these pictures on my SD card for offline processing.
I set up the processing pipeline according to Google's Camera2Basic sample, using a TextureView as well as an ImageReader, both set as target surfaces for a repeating preview request.
mButton.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        if (mIsShooting) {
            try {
                mCaptureSession.stopRepeating();
                mPreviewRequestBuilder.removeTarget(mImageReader.getSurface());
                mCaptureSession.setRepeatingRequest(mPreviewRequestBuilder.build(), mCaptureCallback, mBackgroundHandler);
                mIsShooting = false;
            } catch (CameraAccessException e) {
                e.printStackTrace();
            }
        } else {
            try {
                mCaptureSession.stopRepeating();
                mPreviewRequestBuilder.addTarget(mImageReader.getSurface());
                mCaptureSession.setRepeatingRequest(mPreviewRequestBuilder.build(), mCaptureCallback, mBackgroundHandler);
                mIsShooting = true;
            } catch (CameraAccessException e) {
                e.printStackTrace();
            }
        }
    }
});
The ImageReader is added/removed as a target when pressing the button. The ImageReader's OnImageAvailableListener is implemented as follows:
private ImageReader.OnImageAvailableListener mOnImageAvailableListener = new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image img = reader.acquireLatestImage();
        if (null == img) {
            return;
        }
        if (img.getTimestamp() <= mLatestFrameTime) {
            Log.i(Tag, "disorder detected!");
            img.close(); // release the buffer on the early return, or the reader queue fills up
            return;
        }
        mLatestFrameTime = img.getTimestamp();
        ImageSaver saver = new ImageSaver(img, img.getTimestamp());
        saver.run();
    }
};
I use acquireLatestImage (with the buffer size set to 2) to discard old frames, and I have also checked the images' timestamps to make sure they are monotonically increasing.
The reader does receive images at an acceptable rate (about 25 fps). However, a closer look at the saved image sequence shows they are not always saved in chronological order.
The following pictures come from a long sequence shot by the program (sorry for not being able to post pictures directly :( ):
[Image 1, Image 2 and Image 3 omitted: three consecutive saved frames from the sequence]
Such disorder does not occur very often, but it can occur at any time and does not seem to be an initialization problem. I suspect it has something to do with the ImageReader's buffer size, since with a larger buffer fewer of these "flashbacks" occur. Does anyone have the same problem?

I finally found that the disorder disappears when setting the ImageReader's format to YUV_420_888 in its constructor. Originally I had set this field to JPEG.
Using the JPEG format incurs not only a large processing delay but also the disorder. I guess the conversion from image sensor data to the desired format utilizes other hardware, such as a DSP or GPU, which does not guarantee chronological order.
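For reference, a minimal sketch of the constructor change, assuming a Camera2Basic-style setup (names like mPreviewSize and mBackgroundHandler are placeholders, not from the original post):
// Create the reader with YUV_420_888 instead of ImageFormat.JPEG;
// maxImages is kept small (2) as described above so acquireLatestImage()
// can drop stale frames.
mImageReader = ImageReader.newInstance(
        mPreviewSize.getWidth(), mPreviewSize.getHeight(),
        ImageFormat.YUV_420_888, /* maxImages */ 2);
mImageReader.setOnImageAvailableListener(mOnImageAvailableListener, mBackgroundHandler);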

Are you using TEMPLATE_STILL_CAPTURE for the capture requests when you enable the ImageReader, or just TEMPLATE_PREVIEW? What devices are you seeing issues with?
If you're using STILL_CAPTURE, make sure you check whether the device supports the ENABLE_ZSL flag, and set it to false. When it is set to true (generally the default on devices that support it, for the STILL_CAPTURE template), images may be returned out of order, since there's a zero-shutter-lag queue in place within the camera device.
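A hedged sketch of what that might look like (CONTROL_ENABLE_ZSL exists from API 26; the builder and device names are illustrative):
// Disable the zero-shutter-lag queue on a still-capture request so frames
// are not reordered inside the camera device (API 26+ only).
CaptureRequest.Builder stillBuilder =
        mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
    stillBuilder.set(CaptureRequest.CONTROL_ENABLE_ZSL, false);
}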

Related

Decode h264 video to java.awt.image.BufferedImage in java

I am trying to make an AirPlay server in Java with this library. I am able to start the server and connect to it, and I am getting video input; however, the input is in h264 format, and when I tried decoding it with JCodec it always says it needs an SPS/PPS, and I don't know how to create/find this with just a byte[]. This is the onVideo method, which is pretty much just copy-pasted from some websites:
@Override
public void onVideo(byte[] video) {
    try {
        videoFileChannel.write(ByteBuffer.wrap(video));
        ByteBuffer bb = ByteBuffer.wrap(video);
        H264Decoder decoder = new H264Decoder();
        decoder.addSps(List.of(ByteBuffer.wrap(video)));
        Picture out = Picture.create(1920, 1088, ColorSpace.YUV420);
        var real = decoder.decodeFrame(bb, out.getData());
        // decoder.decodeFrame prints "[WARN] . (:0): Skipping frame as no SPS/PPS have been seen so far..."
        // in the console and returns null => NullPointerException on the next line
        var img = AWTUtil.toBufferedImage(real.createCompatible());
        // ...
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Edit: I've uploaded a ("working") version to GitHub, but the decoded image is discolored and doesn't update all pixels, so when something is on the screen and the frame changes, that something can still be on the image.
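No answer was posted, but as a hedged sketch of one way to find the SPS/PPS: if the sender delivers Annex-B style NAL units with 4-byte 00 00 00 01 start codes (an assumption to verify; some senders use length-prefixed AVCC data instead), they can be split out by NAL type, with SPS being type 7 and PPS type 8:
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;

// Collect all NAL units of one type from an Annex-B stream. The NAL type is
// the low 5 bits of the first byte after each 00 00 00 01 start code.
static List<ByteBuffer> nalUnitsOfType(byte[] data, int wantedType) {
    List<Integer> starts = new ArrayList<>();
    for (int i = 0; i + 4 <= data.length; i++) {
        if (data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 0 && data[i + 3] == 1) {
            starts.add(i + 4); // payload begins right after the start code
        }
    }
    List<ByteBuffer> result = new ArrayList<>();
    for (int n = 0; n < starts.size(); n++) {
        int start = starts.get(n);
        int end = (n + 1 < starts.size()) ? starts.get(n + 1) - 4 : data.length;
        if (start < end && (data[start] & 0x1F) == wantedType) {
            result.add(ByteBuffer.wrap(data, start, end - start));
        }
    }
    return result;
}
The SPS units found this way could then be cached and passed once to decoder.addSps(...), instead of wrapping the whole frame as in the snippet above.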

Is it possible to reduce quality of video file before uploading via android?

I am currently trying to reduce the quality of videos and audio before uploading to an online cloud database. Below is the code I have been using to record videos.
recordVideoIntent.putExtra(MediaStore.EXTRA_VIDEO_QUALITY, 0);
Changing the 0 to 1 in EXTRA_VIDEO_QUALITY will increase the quality and vice versa, but the file is still too large to download if it is a 30-second or longer video.
private void RecordVideoMode() {
    Intent recordVideoIntent = new Intent(MediaStore.ACTION_VIDEO_CAPTURE);
    // Ensure that there's a camera activity to handle the intent
    if (recordVideoIntent.resolveActivity(getPackageManager()) != null) {
        videoFile = createVideoFile();
        // Continue only if the File was successfully created
        if (videoFile != null) {
            videoURI = FileProvider.getUriForFile(this,
                    "com.example.android.fileprovider",
                    videoFile);
            recordVideoIntent.putExtra(MediaStore.EXTRA_VIDEO_QUALITY, 0);
            recordVideoIntent.putExtra(MediaStore.EXTRA_OUTPUT, videoURI);
            startActivityForResult(recordVideoIntent, REQUEST_VIDEO_CAPTURE);
        }
    }
}
Any help is very much appreciated!
You can go with one of these two methods:
1. Encode it to a lower bit rate and/or lower resolution (see the sketch after this list). Have a look here: Is it possible to compress video on Android?
2. Try to zip/compress it. Have a look here: http://www.jondev.net/articles/Zipping_Files_with_Android_%28Programmatically%29
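For the first option, a minimal sketch of recording with MediaRecorder so the resolution and bit rate are under your control, rather than relying on EXTRA_VIDEO_QUALITY (the concrete numbers are illustrative assumptions, and prepare() needs its IOException handled):
// Record directly with MediaRecorder instead of ACTION_VIDEO_CAPTURE, so the
// encoding bit rate and frame size can be forced down before upload.
MediaRecorder recorder = new MediaRecorder();
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setAudioSource(MediaRecorder.AudioSource.MIC);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
recorder.setVideoSize(640, 480);             // lower resolution
recorder.setVideoEncodingBitRate(1000000);   // ~1 Mbps: a 30 s clip is roughly 4 MB
recorder.setOutputFile(videoFile.getAbsolutePath());
recorder.prepare();
recorder.start();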

Process hosting the camera service has died unexpectedly

I have tried everything and I can't find a reason why my camera app is throwing a dead-service exception.
Here is the case: I'm using an HDR JNI library, which I have already checked and it works fine; it's not a leak of native memory, and it's not a JNI problem. So the problem must be in my code.
I'm just waiting for the CaptureResult to return an AE_CONVERGED state, to check whether the sensor has already reached the correct exposure, and then I call my method:
Log.performanceEnd("YUV capture");
Log.d(TAG, "[onImageAvailable] YUV capture, mBurstCount: " + mBurstCount);
Image image = imageReader.acquireNextImage();
if (mBackgroundHandler != null) {
    mBackgroundHandler.post(new YuvCopy(image, mBurstCount));
}
mBurstCount++;
if (mBurstState == BURST_STATE_HDR) {
    switch (mBurstCount) {
        case 1:
            mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION, HDR_EXPOSURE_COMPENSATION_VALUE_HIGH);
            break;
        case 2:
            mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION, HDR_EXPOSURE_COMPENSATION_VALUE_LOW);
            break;
        case 3:
            // Restore exposure compensation value
            mCaptureCallback = mPhotoCaptureCallback;
            mSettingsManager.setExposureCompensation(mPreviewRequestBuilder);
            mActivity.runOnUiThread(new Runnable() {
                @Override
                public void run() {
                    onPictureCaptured();
                }
            });
            unlockFocus();
            break;
    }
    if (mBurstCount != 3) {
        updatePreviewSession();
    }
    // Finish HDR session
    if (mBurstCount < YUV_BURST_LIMIT) mHdrState = STATE_PICTURE_TAKEN;
}
Here is my YUV method:
/**
 * Transform YUV420 to NV21 readable frames
 */
private class YuvCopy implements Runnable {
    private final Image mImage;
    private final int mPictureIndex;

    public YuvCopy(Image image, int index) {
        mImage = image;
        mPictureIndex = index;
    }

    @Override
    public void run() {
        if (mImage != null) {
            if (mImage.getWidth() * mImage.getHeight() > 0) {
                Image.Plane[] planes = mImage.getPlanes();
                long startCopy = System.currentTimeMillis();
                int width = mImage.getWidth();
                int height = mImage.getHeight();
                int ySize = width * height;
                ByteBuffer yBuffer = mImage.getPlanes()[0].getBuffer();
                ByteBuffer uvBuffer = mImage.getPlanes()[1].getBuffer();
                ByteBuffer vuBuffer = mImage.getPlanes()[2].getBuffer();
                byte[] mData = new byte[ySize + (ySize / 2)];
                yBuffer.get(mData, 0, ySize);
                vuBuffer.get(mData, ySize, (ySize / 2) - 1);
                mData[mData.length - 1] = uvBuffer.get(uvBuffer.capacity() - 1);
                mImage.close(); // close exactly once; the original code closed again in an else branch
                mHdrCaptureArray[mPictureIndex] = mData;
                Log.i(TAG, "[YuvCopy|run] Time to Copy data: " + (System.currentTimeMillis() - startCopy) + "ms");
                if (mPictureIndex == YUV_BURST_LIMIT - 1) {
                    startHdrProcessing();
                }
            }
        }
    }
}
I capture a total of three photos and then call the merge method of my JNI library. I tried commenting out all the JNI code and it still happens, so I think the problem must be here, in my YUV method, or maybe in the burst HDR call.
Finally, here is my log error when it happens:
01-01 12:30:27.531 21945-21957/com.myCamera W/AudioSystem: AudioFlinger server died!
01-01 12:30:27.532 21945-22038/com.myCamera W/AudioSystem: AudioPolicyService server died!
01-01 12:30:27.903 21945-21978/com.myCamera I/CameraManagerGlobal: Connecting to camera service
01-01 12:30:27.903 21945-21978/com.myCamera E/CameraManagerGlobal: Camera service is unavailable
01-01 12:30:27.903 21945-21978/com.myCamera W/System.err: android.hardware.camera2.CameraAccessException: Camera service is currently unavailable
01-01 12:30:29.103 21945-21945/com.myCamera W/System.err: android.hardware.camera2.CameraAccessException: Process hosting the camera service has died unexpectedly
Sometimes it takes just 2 photos, and sometimes 300, but in the end it still happens. Also, a lot of the time my whole device becomes almost unresponsive and nothing works properly, so I need to reboot my phone.
Finally, the problem turned out to be a wrong configuration of my ImageReaders: depending on the hardware level of the phone, the camera allows different combinations of ImageReaders with different sizes for each one.
For example, a device with INFO_SUPPORTED_HARDWARE_LEVEL == FULL doesn't support a JPEG ImageReader configured to the max size of the device together with another one in YUV format above the current preview size. Anyway, sometimes it can work and sometimes it fails.
If an application tries to create a session using a set of targets that exceed the limits described in the below tables, one of three possibilities may occur. First, the session may be successfully created and work normally. Second, the session may be successfully created, but the camera device won't meet the frame rate guarantees as described in getOutputMinFrameDuration(int, Size). Or third, if the output set cannot be used at all, session creation will fail entirely, with onConfigureFailed(CameraCaptureSession) being invoked.
Quote from: https://developer.android.com/reference/android/hardware/camera2/CameraDevice.html
That means that my device can't have a YUV ImageReader configured to a 4608x3456 size when my JPEG ImageReader is configured to the same size too; it can only support my preview size (1920x1080). You can check all the possible configurations in this link.
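A minimal sketch of the kind of check involved, assuming a CameraManager and cameraId are at hand and with exception handling omitted (the variable names are illustrative):
// Query the hardware level and the output sizes each format supports, so the
// ImageReader sizes stay within the guaranteed stream combination tables.
CameraCharacteristics characteristics = cameraManager.getCameraCharacteristics(cameraId);
Integer hardwareLevel = characteristics.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL);
StreamConfigurationMap map =
        characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
Size[] yuvSizes = map.getOutputSizes(ImageFormat.YUV_420_888);
Size[] jpegSizes = map.getOutputSizes(ImageFormat.JPEG);
// For a FULL device, the guaranteed YUV + JPEG combinations cap the second
// stream at the preview size, so pick the YUV size accordingly.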

OpenCV taking weird pictures - Java

I have this headache of a problem and I can't seem to fix it. I have a machine that a computer is hooked up to, and whenever a condition is true it takes a picture. The issue is that the picture it takes is sometimes weird; see below. I've tried inverting the picture, but not everything is backward. I've looked everywhere and nothing helped; I tried many different sample codes, and they either don't work or still have this issue.
Normal picture:
http://imgur.com/ve4bp9M
Weird picture:
http://imgur.com/5Z46oPz
public Mat getCapture() {
    if (camera == null || !camera.isOpened()) {
        camera = new VideoCapture(0);
        setCameraValues();
        try {
            Thread.sleep(1000);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
    Mat m = null;
    if (!camera.isOpened()) {
        System.out.println("Error");
    } else {
        m = new Mat();
        while (true) {
            camera.read(m);
            if (!m.empty()) {
                // Mat file is not empty
                break;
            }
        }
    }
    camera.release();
    return m;
}
Below is the method that sets the camera's settings: focus, zoom, brightness, etc.
public void setCameraValues() {
    // The numeric IDs appear to be OpenCV CAP_PROP_* constants (mapping noted for readability)
    this.camera.set(28, ((Integer) this.values.get(0)).intValue());  // CAP_PROP_FOCUS
    this.camera.set(27, ((Integer) this.values.get(1)).intValue());  // CAP_PROP_ZOOM
    this.camera.set(10, ((Integer) this.values.get(2)).intValue());  // CAP_PROP_BRIGHTNESS
    this.camera.set(11, ((Integer) this.values.get(3)).intValue());  // CAP_PROP_CONTRAST
    this.camera.set(12, ((Integer) this.values.get(4)).intValue());  // CAP_PROP_SATURATION
    this.camera.set(15, ((Integer) this.values.get(5)).intValue());  // CAP_PROP_EXPOSURE
    this.camera.set(20, ((Integer) this.values.get(6)).intValue());  // CAP_PROP_SHARPNESS
    this.camera.set(33, ((Integer) this.values.get(7)).intValue());  // CAP_PROP_PAN
    this.camera.set(34, ((Integer) this.values.get(8)).intValue());  // CAP_PROP_TILT
    this.camera.set(3, ((Integer) this.values.get(9)).intValue());   // CAP_PROP_FRAME_WIDTH
    this.camera.set(4, ((Integer) this.values.get(10)).intValue());  // CAP_PROP_FRAME_HEIGHT
}
EDIT:
The webcam I am using is Microsoft LifeCam HD

J2ME - using javax.microedition.amms.control.camera.CameraControl; is it possible to disable shutter sound?

In my BlackBerry app I've implemented the camera and would like to replace the default shutter sound with my own. I figured I could do this by silencing the default camera sound using enableShutterFeedback(false) and then playing my own sound, or by playing my sound immediately before the camera is activated.
private void initializeCamera()
{
    try
    {
        // Create a player for the Blackberry's camera
        Player player = Manager.createPlayer( "capture://video" );
        // Set the player to the REALIZED state (see Player javadoc)
        player.realize();
        // Grab the video control and set it to the current display
        _videoControl = (VideoControl)player.getControl( "VideoControl" );
        if (_videoControl != null)
        {
            // Create the video field as a GUI primitive (as opposed to a
            // direct video, which can only be used on platforms with
            // LCDUI support.)
            _videoField = (Field) _videoControl.initDisplayMode (VideoControl.USE_GUI_PRIMITIVE, "net.rim.device.api.ui.Field");
            _videoControl.setDisplayFullScreen(true);
            _videoControl.setVisible(false);
        }
        cc = (CameraControl)player.getControl("CameraControl");
        cc.enableShutterFeedback(false);
        // Set the player to the STARTED state (see Player javadoc)
        player.start();
    }
    catch(Exception e)
    {
        MyApp.errorDialog("ERROR " + e.getClass() + ": " + e.getMessage());
    }
}
This results in a NullPointerException, but I can't figure out what's causing it, and the camera's video doesn't get displayed. If I remove the two CameraControl lines, then the camera's video is shown. Any ideas what I should try to get rid of the shutter sound? I tried VolumeControl in place of CameraControl, with the same result: a null pointer.
The CameraControl code throws an NPE because player.getControl returns null, and it does so because the string parameter is not correct. Try this one:
CameraControl control = (CameraControl) player.getControl("javax.microedition.amms.control.camera.CameraControl");
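With the fully qualified name, the shutter-feedback call from the question would become something like this sketch (the null check guards devices where the control is still unavailable):
CameraControl control = (CameraControl)
        player.getControl("javax.microedition.amms.control.camera.CameraControl");
if (control != null) {
    control.enableShutterFeedback(false); // silence the default shutter sound
}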
