I'm trying to capture an image via the camera2 API, and I'm able to achieve it.
But the main problem is that when I try to open that image on the phone, it shows an error like "unsupported file format". Here's my code so far:
SimpleDateFormat s = new SimpleDateFormat("ddhhmmss");
String format = s.format(new Date());
private File file = new File(Environment.getExternalStoragePublicDirectory(DIRECTORY_PICTURES)+"/"+"image_" + format + ".jpg");
protected void takePicture() {
if(null == cameraDevice) {
Log.e("tag", "cameraDevice is null");
return;
}
CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
try {
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraDevice.getId());
Size[] jpegSizes = null;
if (characteristics != null) {
jpegSizes = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputSizes(ImageFormat.JPEG);
}
int width = 640;
int height = 480;
if (jpegSizes != null && 0 < jpegSizes.length) {
width = jpegSizes[0].getWidth();
height = jpegSizes[0].getHeight();
}
ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 1);
List<Surface> outputSurfaces = new ArrayList<Surface>(2);
outputSurfaces.add(reader.getSurface());
outputSurfaces.add(new Surface(textureView.getSurfaceTexture()));
final CaptureRequest.Builder captureBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(reader.getSurface());
captureBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
// Orientation
int rotation = getWindowManager().getDefaultDisplay().getRotation();
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
reader.setOnImageAvailableListener(readerListener, mBackgroundHandler);
final CameraCaptureSession.CaptureCallback captureListener = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
Toast.makeText(CaptureImage.this, "Saved:" + file, Toast.LENGTH_SHORT).show();
createCameraPreview();
}
};
cameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
try {
session.capture(captureBuilder.build(), captureListener, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {
}
}, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Image image = null;
try {
image = reader.acquireLatestImage();
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.capacity()];
buffer.get(bytes);
save(bytes);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
if (image != null) {
image.close();
}
}
}
private void save(byte[] bytes) throws IOException {
OutputStream output = null;
try {
output = new FileOutputStream(file);
output.write(bytes);
} finally {
if (null != output) {
output.close();
}
}
}
};
I hope you can help me fix my problem:
I have a TextureView that takes photos, but when it captures a photo, the file that should contain the picture is empty. I don't know where the problem is or what it is; I just think it happens when the file is created. Thank you for helping me.
This is a part of my code:
private void takePicture() throws CameraAccessException, IOException {
if(cameraDevice == null){
return;
}
CameraManager manager = (CameraManager) Objects.requireNonNull(getContext()).getSystemService(Context.CAMERA_SERVICE);
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraDevice.getId());
Size[] jpegSizes = null;
jpegSizes = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputSizes(ImageFormat.JPEG);
int width = 640;
int height = 480;
if(jpegSizes!=null && jpegSizes.length>0){
width = jpegSizes[0].getWidth();
height = jpegSizes[0].getHeight();
}
ImageReader reader = ImageReader.newInstance(width,height, ImageFormat.JPEG, 1);
List<Surface> outputSurface = new ArrayList<>(2);
outputSurface.add(reader.getSurface());
outputSurface.add(new Surface(textureView.getSurfaceTexture()));
final CaptureRequest.Builder captureBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(reader.getSurface());
captureBuilder.set(CaptureRequest.CONTROL_MODE,CameraMetadata.CONTROL_MODE_AUTO);
int rotation = Objects.requireNonNull(getActivity()).getWindowManager().getDefaultDisplay().getRotation();
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION,ORIENTATIONS.get(rotation));
// First I get the path to the gallery and create a new album for my app
File file = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DOCUMENTS);
mImageFolder = new File(file, "Fluico");
if (!mImageFolder.exists()) {
if (!mImageFolder.mkdirs()) {
Log.d("Fluicophoto", "failed to create directory");
}
}
/* Second, I cut mFile = new File(getActivity().getExternalFilesDir(null), "pic.jpg");
from onActivityCreated and added it here with the new path to my album */
@SuppressLint("SimpleDateFormat") String timestamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
String prepend = "IMAGE_" + timestamp + "_";
file = File.createTempFile(prepend, ".jpg", mImageFolder);
Toast.makeText(getContext(), "file need to be save", Toast.LENGTH_SHORT).show();
ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Image image = null;
image = reader.acquireLatestImage();
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.capacity()];
buffer.get(bytes);
try {
save(bytes);
} finally {
image.close();
}
}
};
reader.setOnImageAvailableListener(readerListener,mBackgroundHandler);
final CameraCaptureSession.CaptureCallback captureListener = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
Toast.makeText(getContext(), "saved", Toast.LENGTH_SHORT).show();
try{
createCameraPreview();
}catch (CameraAccessException e){
Toast.makeText(getContext(), "capture not comleted", Toast.LENGTH_SHORT).show();
e.printStackTrace();
}
}
};
cameraDevice.createCaptureSession(outputSurface, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
try {
session.capture(captureBuilder.build(), captureListener, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
Toast.makeText(getContext(), "capture not configured", Toast.LENGTH_SHORT).show();
}
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {
}
},mBackgroundHandler);
}
private void save(byte[] bytes) {
OutputStream outputStream = null;
try {
outputStream = new FileOutputStream(file);
outputStream.write(bytes);
outputStream.close();
Toast.makeText(getContext(), "it's good", Toast.LENGTH_SHORT).show();
}catch (IOException e){
Toast.makeText(getContext(), "file not found", Toast.LENGTH_SHORT).show();
}
}
Try the code below:
File myDir;
String fname;
String id = "123456";
int count = 0;
String root = Environment.getExternalStorageDirectory().toString();
myDir = new File(root + "/PhotoApp/" + id);
myDir.mkdirs();
count++;
counter.setText(String.valueOf(count));
fname = "" + id + "-" + count + ".jpg";
File file = new File(myDir, fname);
if (file.exists()) file.delete();
This code will create a folder named PhotoApp, and all the images will be stored inside it.
You can get the full camera code from my GitHub:
Link: https://github.com/abhishekhzb/quick_camera
I am using the camera2 API in my Android app and I want to put a PNG watermark on images. Please help me if it's possible; thank you in advance.
In other words, I want to put a PNG watermark on captured photos. Is that possible?
In my opinion the camera2 API should have some method for this, but I haven't managed to find anything on the internet; there is not a lot of documentation, so I hope you know something about it.
private void takePicture() {
if(cameraDevice == null)
return;
CameraManager manager = (CameraManager)getSystemService(Context.CAMERA_SERVICE);
try{
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraDevice.getId());
Size[] jpegSizes = null;
if(characteristics != null)
jpegSizes = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP)
.getOutputSizes(ImageFormat.JPEG);
//Capture image with custom size
int width = 1000;
int height = 1000;
/*
if(jpegSizes != null && jpegSizes.length > 0)
{
width = jpegSizes[0].getWidth();
height = jpegSizes[0].getHeight();
}
*/
final ImageReader reader = ImageReader.newInstance(width,height,ImageFormat.JPEG,1);
List<Surface> outputSurface = new ArrayList<>(2);
outputSurface.add(reader.getSurface());
outputSurface.add(new Surface(textureView.getSurfaceTexture()));
final CaptureRequest.Builder captureBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(reader.getSurface());
captureBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
//Check orientation base on device
int rotation = getWindowManager().getDefaultDisplay().getRotation();
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION,ORIENTATIONS.get(rotation));
file = new File(Environment.getExternalStorageDirectory()+"/"+UUID.randomUUID().toString()+".jpg");
ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader imageReader) {
Image image = null;
try{
image = reader.acquireLatestImage();
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.capacity()];
buffer.get(bytes);
save(bytes);
}
catch (FileNotFoundException e)
{
e.printStackTrace();
}
catch (IOException e)
{
e.printStackTrace();
}
finally {
if(image != null)
image.close();
}
}
private void save(byte[] bytes) throws IOException {
OutputStream outputStream = null;
try{
outputStream = new FileOutputStream(file);
outputStream.write(bytes);
}finally {
if(outputStream != null)
outputStream.close();
}
}
};
reader.setOnImageAvailableListener(readerListener,mBackgroundHandler);
final CameraCaptureSession.CaptureCallback captureListener = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
Toast.makeText(MainActivity.this, "Saved "+file, Toast.LENGTH_SHORT).show();
createCameraPreview();
}
};
cameraDevice.createCaptureSession(outputSurface, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
try{
cameraCaptureSession.capture(captureBuilder.build(),captureListener,mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
}
},mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
Custom image manipulation is unrelated to any Camera API, and therefore there are no methods to apply any type of watermark.
The solution is to load the photo as a bitmap, render the watermark through a canvas initialized with the bitmap, and save it again.
Some considerations you need to take into account:
You will need to correctly handle a potential OutOfMemoryError when loading a large image; here are some tips to avoid any problem.
If saving again to JPEG, you may want to copy the original EXIF details to the new, modified image.
You should perform all of these operations in a background thread to avoid blocking the UI, as they will definitely be time-consuming.
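As an illustration, here is a minimal sketch of the load/draw/save approach described above, assuming the captured JPEG is already on disk at photoPath and the watermark PNG ships as a drawable resource (both names are hypothetical, not from the question):

// Minimal sketch: decode the saved JPEG, draw the PNG over it, re-encode.
// photoPath and R.drawable.watermark are placeholder names.
private void applyWatermark(Context context, String photoPath) throws IOException {
    // Decode into a mutable bitmap so a Canvas can draw on it.
    Bitmap photo = BitmapFactory.decodeFile(photoPath)
            .copy(Bitmap.Config.ARGB_8888, true);
    Bitmap mark = BitmapFactory.decodeResource(
            context.getResources(), R.drawable.watermark);

    Canvas canvas = new Canvas(photo);
    // Bottom-right corner with a 16 px margin; placement is arbitrary.
    canvas.drawBitmap(mark,
            photo.getWidth() - mark.getWidth() - 16,
            photo.getHeight() - mark.getHeight() - 16,
            null);

    try (FileOutputStream out = new FileOutputStream(photoPath)) {
        // Re-encoding drops the original EXIF data; copy it back with
        // ExifInterface if you need it, as noted above.
        photo.compress(Bitmap.CompressFormat.JPEG, 90, out);
    }
}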
I'm working on real-time remote screen recording, based on this code.
With that reference code I'm able to record my smartphone's screen successfully, saving an .mp4 file to a previously defined folder.
But for my project, I need this content to be sent to a server (similar to TeamViewer for Android).
My last attempt to send this content to the server side was like this:
OutputStream os = clientSocket.getOutputStream();
BufferedOutputStream bos = new BufferedOutputStream (os);
ObjectOutputStream oos = new ObjectOutputStream(bos);
oos.writeObject(encodedData);
oos.flush();
bos.flush();
os.flush();
Reference => this question
but it raises an error:
java.io.NotSerializableException
on this line > oos.writeObject(encodedData);
So I want to know how to solve this, or some other way to accomplish the task.
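The immediate cause is that ByteBuffer does not implement Serializable, so ObjectOutputStream cannot write it. Below is a minimal sketch of one alternative: copy the encoded bytes out of the codec buffer and send them as a length-prefixed raw stream. The 4-byte framing is an assumption; the server must read back the same way. Variable names follow the code below.

// Sketch: send the encoded frame as length-prefixed raw bytes instead of
// serializing the ByteBuffer (which is not Serializable).
byte[] chunk = new byte[mVideoBufferInfo.size];
encodedData.get(chunk); // position/limit were already set from mVideoBufferInfo

// In a real implementation, create these streams once, not per frame.
DataOutputStream dos = new DataOutputStream(
        new BufferedOutputStream(clientSocket.getOutputStream()));
dos.writeInt(chunk.length); // hypothetical framing: 4-byte length, then payload
dos.write(chunk);
dos.flush();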
Here is the complete code:
/////////////////////////////////////// CLIENTSOCKET ///////////////////////////////////////////
Socket clientSocket;
int SERVERPORT = 60;
String SERVER_IP = "192.168.15.6";
class ClientThread implements Runnable {
@Override
public void run() {
try {
InetAddress serverAddr = InetAddress.getByName(SERVER_IP);
clientSocket = new Socket(serverAddr, SERVERPORT);
new Thread(new CommsThread()).start();
} catch (Exception e1) {
System.out.println(e1.toString());
}
}
}
class CommsThread implements Runnable {
@Override
public void run() {
try {
System.out.println("Waiting for server request");
while(clientSocket.isConnected()){
BufferedReader reader = new BufferedReader(new InputStreamReader(clientSocket.getInputStream()));
PrintWriter out = new PrintWriter(new BufferedWriter(new OutputStreamWriter(clientSocket.getOutputStream())),true);
if (reader.ready()) {
String line;
while ((line = reader.readLine()) != null) {
System.out.println(line);
if(line != null && !line.trim().isEmpty()) {
if(line.equalsIgnoreCase("screenrecord")){
System.out.println(out.toString());
mMediaProjectionManager = (MediaProjectionManager)getSystemService(android.content.Context.MEDIA_PROJECTION_SERVICE);
Intent permissionIntent = mMediaProjectionManager.createScreenCaptureIntent();
startActivityForResult(permissionIntent, REQUEST_CODE_CAPTURE_PERM);
out.flush();
}
if(line.equalsIgnoreCase("exit")) {
stopRecording();
break;
}
}
}
}
Thread.sleep(100);
}
System.out.println("Shutting down Socket!!");
clientSocket.close();
} catch (Exception e1) {
System.out.println(e1.toString());
}
}
}
////////////////////////////////////// MEDIAPROJECTION /////////////////////////////////////////
private static final int REQUEST_CODE_CAPTURE_PERM = 1234;
private static final String VIDEO_MIME_TYPE = "video/avc";
int VIDEO_WIDTH, VIDEO_HEIGHT;
private boolean mMuxerStarted = false;
private MediaProjectionManager mMediaProjectionManager;
private MediaProjection mMediaProjection;
private Surface mInputSurface;
private MediaMuxer mMuxer;
private MediaCodec mVideoEncoder;
private MediaCodec.BufferInfo mVideoBufferInfo;
private int mTrackIndex = -1;
private final Handler mDrainHandler = new Handler(Looper.getMainLooper());
private Runnable mDrainEncoderRunnable = new Runnable() {
@Override
public void run() {
drainEncoder();
}
};
public void stopRecording(){
releaseEncoders();
}
private void startRecording() {
DisplayManager dm = (DisplayManager)getSystemService(Context.DISPLAY_SERVICE);
Display defaultDisplay = dm.getDisplay(Display.DEFAULT_DISPLAY);
if (defaultDisplay == null) {
throw new RuntimeException("No display found.");
}
DisplayMetrics metrics = getResources().getDisplayMetrics();
int screenWidth = metrics.widthPixels;
int screenHeight = metrics.heightPixels;
int screenDensity = metrics.densityDpi;
VIDEO_WIDTH = screenWidth;
VIDEO_HEIGHT = screenHeight;
prepareVideoEncoder();
mMediaProjection.createVirtualDisplay("Recording Display", screenWidth, screenHeight, screenDensity, 0, mInputSurface, null, null);
drainEncoder();
}
private void prepareVideoEncoder() {
mVideoBufferInfo = new MediaCodec.BufferInfo();
MediaFormat format = MediaFormat.createVideoFormat(VIDEO_MIME_TYPE, VIDEO_WIDTH, VIDEO_HEIGHT);
int frameRate = 30;
format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
format.setInteger(MediaFormat.KEY_BIT_RATE, 6000000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, frameRate);
format.setInteger(MediaFormat.KEY_CAPTURE_RATE, frameRate);
format.setInteger(MediaFormat.KEY_REPEAT_PREVIOUS_FRAME_AFTER, 1000000 / frameRate);
format.setInteger(MediaFormat.KEY_CHANNEL_COUNT, 1);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);
try {
mVideoEncoder = MediaCodec.createEncoderByType(VIDEO_MIME_TYPE);
mVideoEncoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
mInputSurface = mVideoEncoder.createInputSurface();
mVideoEncoder.start();
} catch (IOException e) {
releaseEncoders();
}
}
private boolean drainEncoder() {
mDrainHandler.removeCallbacks(mDrainEncoderRunnable);
while (true) {
int bufferIndex = mVideoEncoder.dequeueOutputBuffer(mVideoBufferInfo, 0);
if (bufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
break;
} else if (bufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
} else if (bufferIndex < 0) {
} else {
ByteBuffer encodedData = mVideoEncoder.getOutputBuffer(bufferIndex);
if (encodedData == null) {
throw new RuntimeException("couldn't fetch buffer at index " + bufferIndex);
}
if ((mVideoBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
mVideoBufferInfo.size = 0;
}
if (mVideoBufferInfo.size != 0) {
encodedData.position(mVideoBufferInfo.offset);
encodedData.limit(mVideoBufferInfo.offset + mVideoBufferInfo.size);
try {
OutputStream os = clientSocket.getOutputStream();
BufferedOutputStream bos = new BufferedOutputStream (os);
ObjectOutputStream oos = new ObjectOutputStream(bos);
oos.writeObject(encodedData);
oos.flush();
bos.flush();
os.flush();
} catch (IOException e) {
throw new RuntimeException("couldn't send data to server");
}
} else {
}
mVideoEncoder.releaseOutputBuffer(bufferIndex, false);
}
if ((mVideoBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
break;
}
}
mDrainHandler.postDelayed(mDrainEncoderRunnable, 10);
return false;
}
private void releaseEncoders() {
mDrainHandler.removeCallbacks(mDrainEncoderRunnable);
if (mMuxer != null) {
if (mMuxerStarted) {
mMuxer.stop();
}
mMuxer.release();
mMuxer = null;
mMuxerStarted = false;
}
if (mVideoEncoder != null) {
mVideoEncoder.stop();
mVideoEncoder.release();
mVideoEncoder = null;
}
if (mInputSurface != null) {
mInputSurface.release();
mInputSurface = null;
}
if (mMediaProjection != null) {
mMediaProjection.stop();
mMediaProjection = null;
}
mVideoBufferInfo = null;
mDrainEncoderRunnable = null;
mTrackIndex = -1;
}
////////////////////////////////////////////////////////////////////////////////////////////////
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
new Thread(new ClientThread()).start();
}
public void onActivityResult(int requestCode, int resultCode, Intent intent) {
if (REQUEST_CODE_CAPTURE_PERM == requestCode) {
if (resultCode == RESULT_OK) {
mMediaProjection = mMediaProjectionManager.getMediaProjection(resultCode, intent);
startRecording();
} else {
}
}
}
I use the Android camera2 API for camera2-supporting phones in my app. It works great, but sometimes it captures 0 KB images and gives a fully white preview, with no errors or warnings in logcat. Below is my camera image capture & save code:
protected void takePictureViaCamera2() {
if (null == cameraDevice) {
Log.e(TAG, "cameraDevice is null");
return;
}
CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
try {
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraDevice.getId());
Size[] jpegSizes = null;
if (characteristics != null) {
jpegSizes = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputSizes(ImageFormat.JPEG);
}
int width = 640;
int height = 480;
if (jpegSizes != null && 0 < jpegSizes.length) {
width = jpegSizes[0].getWidth();
height = jpegSizes[0].getHeight();
}
ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 1);
List<Surface> outputSurfaces = new ArrayList<Surface>(2);
outputSurfaces.add(reader.getSurface());
outputSurfaces.add(new Surface(textureView.getSurfaceTexture()));
final CaptureRequest.Builder captureBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(reader.getSurface());
captureBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
// Orientation
int rotation = getWindowManager().getDefaultDisplay().getRotation();
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
//final File file = new File(Environment.getExternalStorageDirectory()+"/pic.jpg");
final File file = createImageFile();
final String checkPath = file.getParent();
Log.e("checkPath", checkPath);
//photoFile = createImageFile();
//photoFilePath = photoFile.getParent();
//Log.e("photoFilePath", photoFilePath);
ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Image image = null;
try {
image = reader.acquireLatestImage();
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.capacity()];
buffer.get(bytes);
save(bytes);
} catch (FileNotFoundException e) {
e.printStackTrace();
Log.e("readerListenerFNF_EX", e + "");
} catch (IOException e) {
e.printStackTrace();
Log.e("readerListenerIO_EX", e + "");
} catch (Exception e) {
e.printStackTrace();
Log.e("readerListener_EX", e + "");
} finally {
if (image != null) {
image.close();
}
}
}
private void save(byte[] bytes) throws IOException {
OutputStream output = null;
try {
output = new FileOutputStream(file);
//output = new FileOutputStream(photoFile);
output.write(bytes);
} finally {
if (null != output) {
output.close();
}
}
}
};
reader.setOnImageAvailableListener(readerListener, mBackgroundHandler);
final CameraCaptureSession.CaptureCallback captureListener = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
Toast.makeText(TravelChargesCamera2Activity.this, "Saved***:" + file, Toast.LENGTH_SHORT).show();
createCameraPreview();
closeCamera();
Thread thread = new Thread() {
@Override
public void run() {
try {
synchronized (this) {
wait(5000);
runOnUiThread(new Runnable() {
@Override
public void run() {
toolbar.setVisibility(View.VISIBLE);
header.setVisibility(View.VISIBLE);
assessorTraveldetails_scroll.setVisibility(View.VISIBLE);
camera2Layout.setVisibility(View.GONE);
try {
if (!file.getParentFile().isDirectory()) {
travelDirectory = new File(checkPath);
Log.e("travelDir==created==", travelDirectory + "");
} else {
travelDirectory = file.getParentFile();
Log.e("travelDir==exists==", travelDirectory + "");
}
files = travelDirectory.listFiles();
Log.e("files", files + "");
// removes old images from view
// prevents display duplication
travelGallery.removeAllViews();
// loop displays the photos captured on Horizontal ScrollView
for (File picFile : files) {
travelGallery.addView(insertPhoto(picFile.getAbsolutePath()));
byte[] byteArray = ImageUtils.fileToByteArray(file);
Log.e("byteArray", byteArray + "");
}
} catch (Exception e) {
e.printStackTrace();
Log.e("file", e + "");
}
}
});
}
} catch (InterruptedException e) {
e.printStackTrace();
Log.e("InterruptedException", e + "");
} catch (Exception e) {
e.printStackTrace();
Log.e("Exception", e + "");
}
}
};
thread.start();
}
};
cameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
try {
session.capture(captureBuilder.build(), captureListener, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {
}
}, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
Log.e("CameraAccessException", e + "");
} catch (Exception e) {
e.printStackTrace();
Log.e("takePictureViaCamera2", e + "");
}
}
private File createImageFile() throws IOException {
// Create an image file name
String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
String uniqueId = UUID.randomUUID().toString();
String imageFileName = "CAP_" + timeStamp + "_" + uniqueId;
String bCode = "";
masterMap = (LinkedTreeMap) gson.fromJson(batchMasterObject.getMasterJson(), Object.class);
auditMap = (LinkedTreeMap) masterMap.get("2001");
if ((auditMap.get("batchCode") != null && !auditMap.get("batchCode").equals(""))) {
bCode = auditMap.get("batchCode").toString();
} else if (auditMap.get("cbc") != null && !auditMap.get("cbc").equals("")) {
bCode = auditMap.get("cbc").toString();
}
File storageDir = new File(getExternalFilesDir(Environment.DIRECTORY_PICTURES), "/" + bCode + "/Travel/");
if (!storageDir.exists()) {
storageDir.mkdirs();
}
String checkPath = storageDir.getPath();
Log.e("checkPath", checkPath);
Toast.makeText(getApplicationContext(), checkPath, Toast.LENGTH_LONG).show();
File image = File.createTempFile(
imageFileName, /* prefix */
".jpg", /* suffix */
storageDir /* directory */
);
// Save a file: path for use with ACTION_VIEW intents
mCurrentPhotoPath = storageDir.getAbsolutePath();
return image;
}
Could anyone help with this?
NOTE: The camera captures and stores an image file as intended, but the file size is 0 KB. When the image file is opened on a phone or PC, it says "Unsupported format".
It looks like you're taking your final JPEG capture as the very first frame.
This means the camera's auto-exposure routines have not had any time to work (since you're grabbing the first frame), so the image will be captured with the initial settings hardcoded into the camera driver. This may work for some scenes, but in others, the image will be far too bright or far too dark.
For best results, you should first let the camera meter for a while, until auto-exposure and auto-focus are converged.
Generally, this can be done by setting up a low-resolution preview output (using a SurfaceTexture, for example) and running it for a second or two, or waiting until a CaptureResult reports an AE_STATE of CONVERGED. Then issue the JPEG capture.
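For reference, a rough sketch of that wait, assuming a repeating preview request is already running and that captureBuilder and mBackgroundHandler exist as in the question's code:

// Sketch: watch the preview results and issue the still capture only once
// auto-exposure reports CONVERGED. captureBuilder and mBackgroundHandler are
// assumptions from the surrounding code; mStillQueued guards against re-firing.
private boolean mStillQueued = false;

private final CameraCaptureSession.CaptureCallback mPreviewCallback =
        new CameraCaptureSession.CaptureCallback() {
    @Override
    public void onCaptureCompleted(CameraCaptureSession session,
                                   CaptureRequest request,
                                   TotalCaptureResult result) {
        Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
        if (!mStillQueued && aeState != null
                && aeState == CameraMetadata.CONTROL_AE_STATE_CONVERGED) {
            mStillQueued = true; // fire the JPEG request exactly once
            try {
                session.capture(captureBuilder.build(), null, mBackgroundHandler);
            } catch (CameraAccessException e) {
                e.printStackTrace();
            }
        }
    }
};

Attach this callback to the repeating preview request, for example session.setRepeatingRequest(previewBuilder.build(), mPreviewCallback, mBackgroundHandler), so each preview result is inspected (previewBuilder is likewise a hypothetical name).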
Using this URL as a reference, I wrote the code below to encode onPreviewFrame data into an MP4 video, and I used a thread to do this job properly, but it doesn't seem to work correctly.
private void initCodec() {
String root = Environment.getExternalStorageDirectory().toString();
File myDir = new File(root + "/Vocalist");
if(!myDir.exists()) {
myDir.mkdirs();
}
try {
File file = new File (myDir, "myVideo.mp4");
if(file.exists()){
file.delete();
}
fos = new FileOutputStream(file, false);
} catch (FileNotFoundException e) {
e.printStackTrace();
}try {
mMediaCodec = MediaCodec.createEncoderByType("video/avc");
}
catch (Exception e){
}
MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc",
320,
240);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 500000);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
mMediaCodec.configure(mediaFormat,
null,
null,
MediaCodec.CONFIGURE_FLAG_ENCODE);
mMediaCodec.start();
inputBuffers = mMediaCodec.getInputBuffers();
outputBuffers = mMediaCodec.getOutputBuffers();
}
private synchronized void encode(byte[] dataInput)
{
byte[] data = dataInput;
inputBuffers = mMediaCodec.getInputBuffers();// here changes
outputBuffers = mMediaCodec.getOutputBuffers();
int inputBufferIndex = mMediaCodec.dequeueInputBuffer(-1);
if (inputBufferIndex >= 0) {
ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
inputBuffer.clear();
inputBuffer.put(data);
mMediaCodec.queueInputBuffer(inputBufferIndex, 0, data.length, 0, 0);
} else {
return;
}
MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
int outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
Log.i("tag", "outputBufferIndex-->" + outputBufferIndex);
do {
if (outputBufferIndex >= 0) {
ByteBuffer outBuffer = outputBuffers[outputBufferIndex];
System.out.println("buffer info-->" + bufferInfo.offset + "--"
+ bufferInfo.size + "--" + bufferInfo.flags + "--"
+ bufferInfo.presentationTimeUs);
byte[] outData = new byte[bufferInfo.size];
outBuffer.get(outData);
try {
if (bufferInfo.offset != 0) {
fos.write(outData, bufferInfo.offset, outData.length
- bufferInfo.offset);
} else {
fos.write(outData, 0, outData.length);
}
fos.flush();
Log.i("camera", "out data -- > " + outData.length);
mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo,
0);
} catch (IOException e) {
e.printStackTrace();
}
} else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
outputBuffers = mMediaCodec.getOutputBuffers();
} else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
MediaFormat format = mMediaCodec.getOutputFormat();
}
} while (outputBufferIndex >= 0);
}
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
if (mHolder.getSurface() == null) {
return;
}
try {
initCodec();
mCamera.setPreviewDisplay(mHolder);
mCamera.setPreviewCallback(new Camera.PreviewCallback() {
@Override
public void onPreviewFrame(final byte[] bytes, Camera camera) {
if (recording == true) {
if(mThread.isAlive())
encode(bytes);
}
}
});
} catch (Exception e) {
Log.d("TAG", "Error starting camera preview: " + e.getMessage());
}
}
}
public void newOpenCamera() {
if (mThread == null) {
mThread = new CameraHandlerThread();
}
synchronized (mThread) {
mThread.openCamera();
}
}
private static void oldOpenCamera() {
try {
c = Camera.open(1);
Camera.Parameters parameters = c.getParameters();
parameters.set("orientation", "portrait");
parameters.setJpegQuality(100);
parameters.setPreviewFormat(ImageFormat.NV21);
parameters.setPreviewSize(320, 240);
c.setParameters(parameters);
}
catch (RuntimeException e) {
Log.e("camera", "failed to open front camera");
}
}
public CameraHandlerThread mThread = null;
public static class CameraHandlerThread extends HandlerThread {
Handler mHandler = null;
CameraHandlerThread() {
super("CameraHandlerThread");
start();
mHandler = new Handler(getLooper());
}
synchronized void notifyCameraOpened() {
notify();
}
public void openCamera() {
mHandler.post(new Runnable() {
@Override
public void run() {
oldOpenCamera();
notifyCameraOpened();
}
});
}
}
I converted the onPreviewFrame data to a video, but after the first second the video doesn't play smoothly. What should I do?
First, you're not forwarding the timing information with the frames:
mMediaCodec.queueInputBuffer(inputBufferIndex, 0, data.length, 0, 0)
So your BufferInfo.presentationTimeUs will always be zero when you dequeue the buffer.
Second, you don't appear to be using MediaMuxer, which means you're just writing the raw H.264 stream to a file. That is not ".mp4"; it doesn't include the timing information at all, and many video players don't even know what to do with a plain H.264 stream.
Wrapping the file as .mp4, with the frame timing from the camera, should yield better results.
Your code structure appears to assume that it can feed one frame of input and get one frame of output, which isn't always the case. You want to keep the input full and drain the output as it becomes available; a sketch follows.
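For illustration, a rough sketch of that muxer-based drain loop, assuming queueInputBuffer now receives a real presentationTimeUs for each frame (for example System.nanoTime() / 1000) and that outputPath is a hypothetical destination file:

// Sketch: drain the encoder into an MP4 container instead of writing raw H.264.
// encoder is the configured, started MediaCodec from the question.
private void drainToMp4(MediaCodec encoder, String outputPath) throws IOException {
    MediaMuxer muxer = new MediaMuxer(outputPath,
            MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    int track = -1;
    boolean done = false;
    while (!done) {
        int index = encoder.dequeueOutputBuffer(bufferInfo, 10000);
        if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            // The muxer takes the codec config (SPS/PPS) from this format.
            track = muxer.addTrack(encoder.getOutputFormat());
            muxer.start();
        } else if (index >= 0) {
            ByteBuffer out = encoder.getOutputBuffer(index);
            if (bufferInfo.size > 0
                    && (bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) == 0) {
                // writeSampleData uses bufferInfo.presentationTimeUs for timing,
                // which is why real timestamps must be queued on the input side.
                muxer.writeSampleData(track, out, bufferInfo);
            }
            done = (bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
            encoder.releaseOutputBuffer(index, false);
        }
        // On INFO_TRY_AGAIN_LATER this simply loops; a real implementation
        // would yield or reschedule instead of busy-waiting.
    }
    muxer.stop();
    muxer.release();
}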
You can find more information and some sample code on bigflake and in Grafika.