I have done a lot of research on my problem, but nothing has helped me get my code running. The preview of my Camera2 API setup works and is displayed on a SurfaceView, but when I try to save a picture on a button click, the ImageReader.OnImageAvailableListener is never called.
android.util.Size[] jpegSizes = null;
jpegSizes = cameraCharacteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputSizes(ImageFormat.JPEG);
int width = jpegSizes[0].getWidth();
int height = jpegSizes[0].getHeight();
ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 1);
List<Surface> outputSurfaces = new ArrayList<Surface>(1);
outputSurfaces.add(reader.getSurface());
outputSurfaces.add(mCameraSurface_1);
final CaptureRequest.Builder captureBuilder =
mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(reader.getSurface());
captureBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
captureBuilder.addTarget(reader.getSurface());
reader.setOnImageAvailableListener(readerListener, mHandler);
I use this code to handle the available image; the log message "success" is never shown:
ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Log.v(TAG,"success" );
Image image = null;
try {
image = reader.acquireLatestImage();
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.capacity()];
buffer.get(bytes);
save(bytes);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
if (image != null) {
image.close();
}
}
}}
The problem is that you never build the captureBuilder to get a CaptureRequest, and you never submit that request. You need to create a CameraCaptureSession and pass captureBuilder.build() to the session's capture method. Please see: https://developer.android.com/reference/android/hardware/camera2/CaptureRequest.Builder
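A minimal sketch of that missing step, reusing the names from the question (mCameraDevice, outputSurfaces, captureBuilder, mHandler); treat it as an outline rather than your exact code:

mCameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
    @Override
    public void onConfigured(CameraCaptureSession session) {
        try {
            // The missing step: build the request and submit it to the session.
            // Without this capture() call, no frame is ever delivered to the
            // ImageReader, so onImageAvailable never fires.
            session.capture(captureBuilder.build(), null, mHandler);
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void onConfigureFailed(CameraCaptureSession session) {
        Log.e(TAG, "Capture session configuration failed");
    }
}, mHandler);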
I have this code to take screenshots. On the Android Studio virtual device it works perfectly, but on my own Samsung device (Android 8.0), when I install the APK, the program only saves some of the screenshots. For example, I press the button to take 10 screenshots, but the program only saves 2 and the rest don't exist. I have no idea what the problem is.
@SuppressLint("WrongConstant")
private void screenshot() {
Point size = new Point();
displayy.getSize(size);
mWidth = size.x;
mHeight = size.y;
displayy.getMetrics(metrics);
int mDensity = metrics.densityDpi;
Handler handler=new Handler();
mImageReader = ImageReader.newInstance(mWidth, mHeight, PixelFormat.RGBA_8888, 2);
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
displayyy = mMediaProjection.createVirtualDisplay(DISPLAY_NAME, mWidth, mHeight, mDensity, DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR, mImageReader.getSurface(), null, handler);
}
mImageReader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
int onImageCount = 0;
@Override
public void onImageAvailable(ImageReader reader) {
Image image = null;
FileOutputStream fos = null;
Bitmap bitmap = null;
try {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT) {
image = reader.acquireLatestImage();
}
if (image != null) {
Image.Plane[] planes = new Image.Plane[0];
if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.KITKAT) {
planes = image.getPlanes();
}
ByteBuffer buffer = planes[0].getBuffer();
int pixelStride = planes[0].getPixelStride();
int rowStride = planes[0].getRowStride();
int rowPadding = rowStride - pixelStride * mWidth;
bitmap = Bitmap.createBitmap(mWidth + rowPadding / pixelStride, mHeight, Bitmap.Config.ARGB_8888);
bitmap.copyPixelsFromBuffer(buffer);
SimpleDateFormat df = new SimpleDateFormat("dd-MM-yyyy_HH:mm:ss");
String formattedDate = df.format(Calendar.getInstance().getTime()).trim();
String finalDate = formattedDate.replace(":", "-");
String imgName = "/imagen"+ "_" + finalDate + ".jpg";
String dirPath = Environment.getExternalStorageDirectory().getAbsolutePath()
+ "/example";
File dir = new File(dirPath);
if(!dir.exists()){
dir.mkdirs();
}
String mPath = dirPath + imgName;
File imageFile = new File(mPath);
fos = new FileOutputStream(imageFile);
bitmap.compress(Bitmap.CompressFormat.JPEG, 100, fos);
Log.e(TAG, "captured image: " );
}
} catch (Exception e) {
e.printStackTrace();
} finally {
if (fos != null) {
try {
fos.close();
} catch (IOException ioe) {
ioe.printStackTrace();
}
}
if (bitmap != null) {
bitmap.recycle();
}
if (image != null) {
image.close();
}
}
}
}, handler);
mMediaProjection.stop();
mMediaProjection=null;
}
I don't know if you've already solved your problem, but here's my solution.
ImageReader.newInstance receives 4 parameters, and the last one is the maximum number of images the reader can buffer at once:
ImageReader.newInstance(int width, int height, int format, int maxImages)
If you look closely, you are passing 2 as that last parameter:
ImageReader.newInstance(mWidth, mHeight, PixelFormat.RGBA_8888, 2);
So the reader can only hold 2 images at a time; if you want more screenshots, raise that limit (and make sure every acquired Image is closed).
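A minimal sketch of that change, keeping the field names from the question; 10 is just an illustrative value:

// Let the reader buffer more frames. Every acquired Image must still be
// closed after use, or the buffer fills up and no further frames are
// delivered to the listener.
mImageReader = ImageReader.newInstance(mWidth, mHeight, PixelFormat.RGBA_8888, 10);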
I am trying to take a screenshot of my augmented reality screen and pass it as a bitmap to another activity.
This is the code that I am using to take the screenshot:
Function to take the screenshot:
public static void tmpScreenshot(Bitmap bmp, Context context){
try {
//Write file
String filename = "bitmap.png";
FileOutputStream stream = context.openFileOutput(filename, Context.MODE_PRIVATE);
bmp.compress(Bitmap.CompressFormat.PNG, 100, stream);
//Cleanup
stream.close();
bmp.recycle();
//Pop intent
Intent in1 = new Intent(context, CostActivity.class);
in1.putExtra("image", filename);
context.startActivity(in1);
} catch (Exception e) {
e.printStackTrace();
}
}
Function to receive screenshot
private void loadTmpBitmap() {
Bitmap bmp = null;
String filename = getIntent().getStringExtra("image");
try {
FileInputStream is = this.openFileInput(filename);
bmp = BitmapFactory.decodeStream(is);
ImageView imageView = findViewById(R.id.test);
imageView.setImageBitmap(Bitmap.createScaledBitmap(bmp, 120, 120, false));
is.close();
} catch (Exception e) {
e.printStackTrace();
}
}
}
Even though the screenshot was taken, it is black when it is passed to the other activity.
In addition, the screenshot only appeared after I pressed the back button.
Can anyone help me with the code to take a screenshot with ARCore? Or what am I doing wrong?
It is not possible to take a screenshot of a SurfaceView with that method; the result will be black, because that approach only works for regular views.
What you need to use is PixelCopy:
private void takePhoto() {
ArSceneView view = arFragment.getArSceneView();
// Create a bitmap the size of the scene view.
final Bitmap bitmap = Bitmap.createBitmap(view.getWidth(), view.getHeight(),
Bitmap.Config.ARGB_8888);
// Create a handler thread to offload the processing of the image.
final HandlerThread handlerThread = new HandlerThread("PixelCopier");
handlerThread.start();
// Make the request to copy.
PixelCopy.request(view, bitmap, (copyResult) -> {
if (copyResult == PixelCopy.SUCCESS) {
try {
saveBitmapToDisk(bitmap);
} catch (IOException e) {
Toast toast = Toast.makeText(VisualizerActivity.this, e.toString(),
Toast.LENGTH_LONG);
toast.show();
return;
}
SnackbarUtility.showSnackbarTypeLong(settingsButton, "Screenshot saved in /Pictures/Screenshots");
} else {
SnackbarUtility.showSnackbarTypeLong(settingsButton, "Failed to take screenshot");
}
handlerThread.quitSafely();
}, new Handler(handlerThread.getLooper()));
}
public void saveBitmapToDisk(Bitmap bitmap) throws IOException {
// String path = Environment.getExternalStorageDirectory().toString() + "/Pictures/Screenshots/";
if (videoDirectory == null) {
videoDirectory =
new File(
Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES)
+ "/Screenshots");
}
Calendar c = Calendar.getInstance();
SimpleDateFormat df = new SimpleDateFormat("yyyy-MM-dd HH.mm.ss");
String formattedDate = df.format(c.getTime());
File mediaFile = new File(videoDirectory, "FieldVisualizer"+formattedDate+".jpeg");
FileOutputStream fileOutputStream = new FileOutputStream(mediaFile);
bitmap.compress(Bitmap.CompressFormat.JPEG, 70, fileOutputStream);
fileOutputStream.flush();
fileOutputStream.close();
}
I am using the Camera2 API in my Android app, and I want to add a PNG watermark to images. Please help me if it's possible; thank you in advance.
In other words: I want to put a PNG watermark on captured photos. Is that possible?
In my opinion the Camera2 API must have some method for this, but I haven't managed to find anything on the internet; there is not a lot of documentation. I hope you know something about it.
private void takePicture() {
if(cameraDevice == null)
return;
CameraManager manager = (CameraManager)getSystemService(Context.CAMERA_SERVICE);
try{
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraDevice.getId());
Size[] jpegSizes = null;
if(characteristics != null)
jpegSizes = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP)
.getOutputSizes(ImageFormat.JPEG);
//Capture image with custom size
int width = 1000;
int height = 1000;
/*
if(jpegSizes != null && jpegSizes.length > 0)
{
width = jpegSizes[0].getWidth();
height = jpegSizes[0].getHeight();
}
*/
final ImageReader reader = ImageReader.newInstance(width,height,ImageFormat.JPEG,1);
List<Surface> outputSurface = new ArrayList<>(2);
outputSurface.add(reader.getSurface());
outputSurface.add(new Surface(textureView.getSurfaceTexture()));
final CaptureRequest.Builder captureBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(reader.getSurface());
captureBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
//Check orientation base on device
int rotation = getWindowManager().getDefaultDisplay().getRotation();
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION,ORIENTATIONS.get(rotation));
file = new File(Environment.getExternalStorageDirectory()+"/"+UUID.randomUUID().toString()+".jpg");
ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader imageReader) {
Image image = null;
try{
image = reader.acquireLatestImage();
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.capacity()];
buffer.get(bytes);
save(bytes);
}
catch (FileNotFoundException e)
{
e.printStackTrace();
}
catch (IOException e)
{
e.printStackTrace();
}
finally {
if(image != null)
image.close();
}
}
private void save(byte[] bytes) throws IOException {
OutputStream outputStream = null;
try{
outputStream = new FileOutputStream(file);
outputStream.write(bytes);
}finally {
if(outputStream != null)
outputStream.close();
}
}
};
reader.setOnImageAvailableListener(readerListener,mBackgroundHandler);
final CameraCaptureSession.CaptureCallback captureListener = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
Toast.makeText(MainActivity.this, "Saved "+file, Toast.LENGTH_SHORT).show();
createCameraPreview();
}
};
cameraDevice.createCaptureSession(outputSurface, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
try{
cameraCaptureSession.capture(captureBuilder.build(),captureListener,mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
}
},mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
Custom image manipulation is unrelated to any camera API, so there are no Camera2 methods to apply any type of watermark.
The solution is to load the photo as a bitmap, render the watermark through a canvas initialized with the bitmap, and save the result again; a sketch follows the considerations below.
Some considerations you need to take into account:
You will need to handle a possible OutOfMemoryError when loading a large image, for example by downsampling it with BitmapFactory.Options.inSampleSize.
If saving again to JPEG, you may want to copy the original EXIF details to the new modified image.
You should perform all of this in a background thread to avoid blocking the UI, as the operation will definitely be time consuming.
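A minimal sketch of that approach, assuming the captured JPEG bytes are in hand and the watermark is a PNG resource (R.drawable.watermark and the alpha value are placeholders, not names from the question):

private byte[] addWatermark(byte[] jpegBytes, Context context) throws IOException {
    // Decode the captured JPEG into a mutable bitmap so a canvas can draw on it.
    BitmapFactory.Options opts = new BitmapFactory.Options();
    opts.inMutable = true;
    Bitmap photo = BitmapFactory.decodeByteArray(jpegBytes, 0, jpegBytes.length, opts);

    // R.drawable.watermark is a placeholder for your own PNG resource.
    Bitmap watermark = BitmapFactory.decodeResource(context.getResources(), R.drawable.watermark);

    // Draw the watermark in the bottom-right corner with some transparency.
    Canvas canvas = new Canvas(photo);
    Paint paint = new Paint(Paint.FILTER_BITMAP_FLAG);
    paint.setAlpha(128);
    canvas.drawBitmap(watermark,
            photo.getWidth() - watermark.getWidth(),
            photo.getHeight() - watermark.getHeight(),
            paint);

    // Re-encode to JPEG. Note the original EXIF data is lost here and would
    // need to be copied separately (e.g. with ExifInterface).
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    photo.compress(Bitmap.CompressFormat.JPEG, 90, out);
    watermark.recycle();
    photo.recycle();
    return out.toByteArray();
}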
I use the Android Camera2 API for camera2-supporting phones in my app. It works great, but sometimes it captures 0 KB images and gives a fully white preview, with no errors or warnings in logcat. Below is my camera image capture and save code:
protected void takePictureViaCamera2() {
if (null == cameraDevice) {
Log.e(TAG, "cameraDevice is null");
return;
}
CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
try {
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraDevice.getId());
Size[] jpegSizes = null;
if (characteristics != null) {
jpegSizes = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputSizes(ImageFormat.JPEG);
}
int width = 640;
int height = 480;
if (jpegSizes != null && 0 < jpegSizes.length) {
width = jpegSizes[0].getWidth();
height = jpegSizes[0].getHeight();
}
ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 1);
List<Surface> outputSurfaces = new ArrayList<Surface>(2);
outputSurfaces.add(reader.getSurface());
outputSurfaces.add(new Surface(textureView.getSurfaceTexture()));
final CaptureRequest.Builder captureBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(reader.getSurface());
captureBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
// Orientation
int rotation = getWindowManager().getDefaultDisplay().getRotation();
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
//final File file = new File(Environment.getExternalStorageDirectory()+"/pic.jpg");
final File file = createImageFile();
final String checkPath = file.getParent();
Log.e("checkPath", checkPath);
//photoFile = createImageFile();
//photoFilePath = photoFile.getParent();
//Log.e("photoFilePath", photoFilePath);
ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Image image = null;
try {
image = reader.acquireLatestImage();
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.capacity()];
buffer.get(bytes);
save(bytes);
} catch (FileNotFoundException e) {
e.printStackTrace();
Log.e("readerListenerFNF_EX", e + "");
} catch (IOException e) {
e.printStackTrace();
Log.e("readerListenerIO_EX", e + "");
} catch (Exception e) {
e.printStackTrace();
Log.e("readerListener_EX", e + "");
} finally {
if (image != null) {
image.close();
}
}
}
private void save(byte[] bytes) throws IOException {
OutputStream output = null;
try {
output = new FileOutputStream(file);
//output = new FileOutputStream(photoFile);
output.write(bytes);
} finally {
if (null != output) {
output.close();
}
}
}
};
reader.setOnImageAvailableListener(readerListener, mBackgroundHandler);
final CameraCaptureSession.CaptureCallback captureListener = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
Toast.makeText(TravelChargesCamera2Activity.this, "Saved***:" + file, Toast.LENGTH_SHORT).show();
createCameraPreview();
closeCamera();
Thread thread = new Thread() {
@Override
public void run() {
try {
synchronized (this) {
wait(5000);
runOnUiThread(new Runnable() {
@Override
public void run() {
toolbar.setVisibility(View.VISIBLE);
header.setVisibility(View.VISIBLE);
assessorTraveldetails_scroll.setVisibility(View.VISIBLE);
camera2Layout.setVisibility(View.GONE);
try {
if (!file.getParentFile().isDirectory()) {
travelDirectory = new File(checkPath);
Log.e("travelDir==created==", travelDirectory + "");
} else {
travelDirectory = file.getParentFile();
Log.e("travelDir==exists==", travelDirectory + "");
}
files = travelDirectory.listFiles();
Log.e("files", files + "");
// removes old images from view
// prevents display duplication
travelGallery.removeAllViews();
// loop displays the photos captured on Horizontal ScrollView
for (File picFile : files) {
travelGallery.addView(insertPhoto(picFile.getAbsolutePath()));
byte[] byteArray = ImageUtils.fileToByteArray(file);
Log.e("byteArray", byteArray + "");
}
} catch (Exception e) {
e.printStackTrace();
Log.e("file", e + "");
}
}
});
}
} catch (InterruptedException e) {
e.printStackTrace();
Log.e("InterruptedException", e + "");
} catch (Exception e) {
e.printStackTrace();
Log.e("Exception", e + "");
}
}
};
thread.start();
}
};
cameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
try {
session.capture(captureBuilder.build(), captureListener, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {
}
}, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
Log.e("CameraAccessException", e + "");
} catch (Exception e) {
e.printStackTrace();
Log.e("takePictureViaCamera2", e + "");
}
}
private File createImageFile() throws IOException {
// Create an image file name
String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
String uniqueId = UUID.randomUUID().toString();
String imageFileName = "CAP_" + timeStamp + "_" + uniqueId;
String bCode = "";
masterMap = (LinkedTreeMap) gson.fromJson(batchMasterObject.getMasterJson(), Object.class);
auditMap = (LinkedTreeMap) masterMap.get("2001");
if ((auditMap.get("batchCode") != null && !auditMap.get("batchCode").equals(""))) {
bCode = auditMap.get("batchCode").toString();
} else if (auditMap.get("cbc") != null && !auditMap.get("cbc").equals("")) {
bCode = auditMap.get("cbc").toString();
}
File storageDir = new File(getExternalFilesDir(Environment.DIRECTORY_PICTURES), "/" + bCode + "/Travel/");
if (!storageDir.exists()) {
storageDir.mkdirs();
}
String checkPath = storageDir.getPath();
Log.e("checkPath", checkPath);
Toast.makeText(getApplicationContext(), checkPath, Toast.LENGTH_LONG).show();
File image = File.createTempFile(
imageFileName, /* prefix */
".jpg", /* suffix */
storageDir /* directory */
);
// Save a file: path for use with ACTION_VIEW intents
mCurrentPhotoPath = storageDir.getAbsolutePath();
return image;
}
Could anyone help with this?
NOTE: The camera captures and stores an image file as intended, but the file size is 0 KB. When the image file is opened on a phone or PC, it says "Unsupported format".
It looks like you're taking your final JPEG capture as the very first frame.
This means the camera's auto-exposure routines have not had any time to work (since you're grabbing the first frame), so the image will be captured with the initial settings hardcoded into the camera driver. This may work for some scenes, but in others, the image will be far too bright or far too dark.
For best results, you should first let the camera meter for a while, until auto-exposure and auto-focus have converged.
Generally, this can be done by setting up a low-resolution preview output (using a SurfaceTexture, for example) and running it for a second or two, or by waiting until a CaptureResult reports an AE_STATE of CONVERGED. Then issue the JPEG capture, as in the sketch below.
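A minimal sketch of that check, assuming a repeating preview request is already running and that takePictureViaCamera2() (from the question) issues the still capture; the callback would be passed to the preview session's setRepeatingRequest:

// Watch preview results and fire the still capture only once
// auto-exposure has converged.
private final CameraCaptureSession.CaptureCallback mPreviewCallback =
        new CameraCaptureSession.CaptureCallback() {
    private boolean mCaptured = false;

    @Override
    public void onCaptureCompleted(CameraCaptureSession session,
                                   CaptureRequest request,
                                   TotalCaptureResult result) {
        Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
        // Some devices report a null AE state; treat that as converged.
        if (!mCaptured && (aeState == null
                || aeState == CaptureResult.CONTROL_AE_STATE_CONVERGED)) {
            mCaptured = true;
            takePictureViaCamera2();
        }
    }
};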
I have an app that receives a QR code from the server. I want to decode it (not with an intent and the camera) and display the text it contains in my app. I have already done this in Java SE with the ZXing jars, using this code:
private class QRCodeDecoder {
public String decode(File imageFile) {
BufferedImage image;
try {
image = ImageIO.read(imageFile);
} catch (IOException e1) {
return "io outch";
}
// creating luminance source
LuminanceSource lumSource = new BufferedImageLuminanceSource(image);
BinaryBitmap bitmap = new BinaryBitmap(new HybridBinarizer(lumSource));
// barcode decoding
QRCodeReader reader = new QRCodeReader();
Result result = null;
try {
result = reader.decode(bitmap);
} catch (ReaderException e) {
return "reader error";
}
return result.getText();
}
}
But on Android, BufferedImage is not available.
Has anyone decoded a QR code on Android from an image stored on the phone?
Thanks.
In Android, you can do it this way:
@Override
protected Result doInBackground(Void... params)
{
try
{
InputStream inputStream = activity.getContentResolver().openInputStream(uri);
Bitmap bitmap = BitmapFactory.decodeStream(inputStream);
if (bitmap == null)
{
Log.e(TAG, "uri is not a bitmap," + uri.toString());
return null;
}
int width = bitmap.getWidth(), height = bitmap.getHeight();
int[] pixels = new int[width * height];
bitmap.getPixels(pixels, 0, width, 0, 0, width, height);
bitmap.recycle();
bitmap = null;
RGBLuminanceSource source = new RGBLuminanceSource(width, height, pixels);
BinaryBitmap bBitmap = new BinaryBitmap(new HybridBinarizer(source));
MultiFormatReader reader = new MultiFormatReader();
try
{
Result result = reader.decode(bBitmap);
return result;
}
catch (NotFoundException e)
{
Log.e(TAG, "decode exception", e);
return null;
}
}
catch (FileNotFoundException e)
{
Log.e(TAG, "can not open file" + uri.toString(), e);
return null;
}
}
Download ZXing from Google Code; the class file ZXing-1.6/zxing-1.6/androidtest/src/com/google/zxing/client/androidtest/RGBLuminanceSource.java can help you.
QuickMark and QR Droid actually read out what the code says, and they can decode barcodes saved on your phone. Hit the menu button when you load the image, select share, and pick QR Droid or QuickMark to decode; they'll do the magic. I prefer QuickMark for reading codes, because it tells me what is typed in the code.