Android - Save image listener - java

I'm trying to save a picture on the device, and I want to change the activity once the image has been saved. How do I know when the picture has been saved?
Save image code:
public void onBitmapLoaded(final Bitmap bitmap, Picasso.LoadedFrom from) {
new Thread(new Runnable() {
@Override
public void run() {
File file = null;
try {
file = createImageFile();
} catch (IOException e) {
e.printStackTrace();
}
try {
FileOutputStream ostream = new FileOutputStream(file);
bitmap.compress(Bitmap.CompressFormat.JPEG, 100, ostream);
ostream.flush();
ostream.close();
} catch (IOException e) {
Log.e("IOException", e.getLocalizedMessage());
}
}
}).start();
}

Saving the image to the Gallery will eventually store it on your device as well.
Try the code below:
MediaStore.Images.Media.insertImage(getContentResolver(), yourBitmap, yourTitle , yourDescription);
public class CapturePhotoUtils {
/**
* A copy of the Android internals insertImage method, this method populates the
* meta data with DATE_ADDED and DATE_TAKEN. This fixes a common problem where media
* that is inserted manually gets saved at the end of the gallery (because date is not populated).
* @see android.provider.MediaStore.Images.Media#insertImage(ContentResolver, Bitmap, String, String)
*/
public static final String insertImage(ContentResolver cr,
Bitmap source,
String title,
String description) {
ContentValues values = new ContentValues();
values.put(Images.Media.TITLE, title);
values.put(Images.Media.DISPLAY_NAME, title);
values.put(Images.Media.DESCRIPTION, description);
values.put(Images.Media.MIME_TYPE, "image/jpeg");
// Add the date meta data to ensure the image is added at the front of the gallery
values.put(Images.Media.DATE_ADDED, System.currentTimeMillis());
values.put(Images.Media.DATE_TAKEN, System.currentTimeMillis());
Uri url = null;
String stringUrl = null; /* value to be returned */
try {
url = cr.insert(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, values);
if (source != null) {
OutputStream imageOut = cr.openOutputStream(url);
try {
source.compress(Bitmap.CompressFormat.JPEG, 50, imageOut);
} finally {
imageOut.close();
}
long id = ContentUris.parseId(url);
// Wait until MINI_KIND thumbnail is generated.
Bitmap miniThumb = Images.Thumbnails.getThumbnail(cr, id, Images.Thumbnails.MINI_KIND, null);
// This is for backward compatibility.
storeThumbnail(cr, miniThumb, id, 50F, 50F,Images.Thumbnails.MICRO_KIND);
} else {
cr.delete(url, null, null);
url = null;
}
} catch (Exception e) {
if (url != null) {
cr.delete(url, null, null);
url = null;
}
}
if (url != null) {
stringUrl = url.toString();
}
return stringUrl;
}
/**
* A copy of the Android internals StoreThumbnail method, it used with the insertImage to
* populate the android.provider.MediaStore.Images.Media#insertImage with all the correct
* meta data. The StoreThumbnail method is private so it must be duplicated here.
* @see android.provider.MediaStore.Images.Media (StoreThumbnail private method)
*/
private static final Bitmap storeThumbnail(
ContentResolver cr,
Bitmap source,
long id,
float width,
float height,
int kind) {
// create the matrix to scale it
Matrix matrix = new Matrix();
float scaleX = width / source.getWidth();
float scaleY = height / source.getHeight();
matrix.setScale(scaleX, scaleY);
Bitmap thumb = Bitmap.createBitmap(source, 0, 0,
source.getWidth(),
source.getHeight(), matrix,
true
);
ContentValues values = new ContentValues(4);
values.put(Images.Thumbnails.KIND,kind);
values.put(Images.Thumbnails.IMAGE_ID,(int)id);
values.put(Images.Thumbnails.HEIGHT,thumb.getHeight());
values.put(Images.Thumbnails.WIDTH,thumb.getWidth());
Uri url = cr.insert(Images.Thumbnails.EXTERNAL_CONTENT_URI, values);
try {
OutputStream thumbOut = cr.openOutputStream(url);
thumb.compress(Bitmap.CompressFormat.JPEG, 100, thumbOut);
thumbOut.close();
return thumb;
} catch (FileNotFoundException ex) {
return null;
} catch (IOException ex) {
return null;
}
}
}
To know whether the image was saved or not, just put a check in the storeThumbnail method, or check the return value of insertImage, which is null when saving failed.
Hope this answers your question.
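If you need an explicit signal in your own code, for example to start the next activity only after the write has finished, one option is to wrap the save in a small callback interface. A minimal sketch (the interface and method names are my assumptions, not part of any library; it uses android.os.Handler and Looper to get back onto the main thread):
public interface OnImageSavedListener {
    void onImageSaved(String url);   // the value returned by CapturePhotoUtils.insertImage
    void onImageSaveFailed();
}

public void saveAndNotify(final ContentResolver cr, final Bitmap bitmap,
                          final OnImageSavedListener listener) {
    new Thread(new Runnable() {
        @Override
        public void run() {
            // insertImage returns the content URI as a String, or null if saving failed.
            final String url = CapturePhotoUtils.insertImage(cr, bitmap, "title", "description");
            // Report back on the main thread so the caller can start the next activity safely.
            new Handler(Looper.getMainLooper()).post(new Runnable() {
                @Override
                public void run() {
                    if (url != null) {
                        listener.onImageSaved(url);
                    } else {
                        listener.onImageSaveFailed();
                    }
                }
            });
        }
    }).start();
}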

Not quite sure what you intend to do, but you can always save the image and then start the activity at the end of the function.
Another way is to use a FileObserver: FileObserver examples. A rough sketch follows.
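A minimal sketch of the FileObserver approach, assuming the directory being written to and the expected file name are known (both names here are assumptions):
// Keep a reference to the observer as a field, otherwise it can be garbage collected.
FileObserver observer = new FileObserver(storageDir.getAbsolutePath(), FileObserver.CLOSE_WRITE) {
    @Override
    public void onEvent(int event, String path) {
        // path is relative to the watched directory; expectedFileName is an assumed field.
        if (path != null && path.equals(expectedFileName)) {
            stopWatching();
            // The image has been fully written; switch to the next activity from the UI thread here.
        }
    }
};
observer.startWatching();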

Related

How can I take screenshots using ARCore?

I am trying to take a screenshot of my Augmented Reality Screen and pass it as a bitmap to another activity.
This is the code that I am using to take the screenshot:
Function to take screen shot
public static void tmpScreenshot(Bitmap bmp, Context context){
try {
//Write file
String filename = "bitmap.png";
FileOutputStream stream = context.openFileOutput(filename, Context.MODE_PRIVATE);
bmp.compress(Bitmap.CompressFormat.PNG, 100, stream);
//Cleanup
stream.close();
bmp.recycle();
//Pop intent
Intent in1 = new Intent(context, CostActivity.class);
in1.putExtra("image", filename);
context.startActivity(in1);
} catch (Exception e) {
e.printStackTrace();
}
}
Function to receive screenshot
private void loadTmpBitmap() {
Bitmap bmp = null;
String filename = getIntent().getStringExtra("image");
try {
FileInputStream is = this.openFileInput(filename);
bmp = BitmapFactory.decodeStream(is);
ImageView imageView = findViewById(R.id.test);
imageView.setImageBitmap(Bitmap.createScaledBitmap(bmp, 120, 120, false));
is.close();
} catch (Exception e) {
e.printStackTrace();
}
}
}
Even though the screenshot was taken, it was black when passed to the other activity.
In addition, the screenshot only appeared after I pressed the back button.
Can anyone help me with the code to take a screenshot with ARCore, or tell me what I am doing wrong?
It is not possible to take a screenshot of a SurfaceView with your method; the result will be black, because that approach only works for regular views.
What you need to use is PixelCopy.
private void takePhoto() {
ArSceneView view = arFragment.getArSceneView();
// Create a bitmap the size of the scene view.
final Bitmap bitmap = Bitmap.createBitmap(view.getWidth(), view.getHeight(),
Bitmap.Config.ARGB_8888);
// Create a handler thread to offload the processing of the image.
final HandlerThread handlerThread = new HandlerThread("PixelCopier");
handlerThread.start();
// Make the request to copy.
PixelCopy.request(view, bitmap, (copyResult) -> {
if (copyResult == PixelCopy.SUCCESS) {
try {
saveBitmapToDisk(bitmap);
} catch (IOException e) {
Toast toast = Toast.makeText(VisualizerActivity.this, e.toString(),
Toast.LENGTH_LONG);
toast.show();
return;
}
SnackbarUtility.showSnackbarTypeLong(settingsButton, "Screenshot saved in /Pictures/Screenshots");
} else {
SnackbarUtility.showSnackbarTypeLong(settingsButton, "Failed to take screenshot");
}
handlerThread.quitSafely();
}, new Handler(handlerThread.getLooper()));
}
public void saveBitmapToDisk(Bitmap bitmap) throws IOException {
// String path = Environment.getExternalStorageDirectory().toString() + "/Pictures/Screenshots/";
if (videoDirectory == null) {
videoDirectory =
new File(
Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES)
+ "/Screenshots");
}
Calendar c = Calendar.getInstance();
SimpleDateFormat df = new SimpleDateFormat("yyyy-MM-dd HH.mm.ss");
String formattedDate = df.format(c.getTime());
File mediaFile = new File(videoDirectory, "FieldVisualizer"+formattedDate+".jpeg");
FileOutputStream fileOutputStream = new FileOutputStream(mediaFile);
bitmap.compress(Bitmap.CompressFormat.JPEG, 70, fileOutputStream);
fileOutputStream.flush();
fileOutputStream.close();
}
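One extra step worth noting, assuming you also want the screenshot to show up in gallery apps right away: a file written directly to external storage is not indexed until the media scanner sees it, so you can trigger a scan at the end of saveBitmapToDisk. A small sketch (using the Activity as the Context):
// Ask the media scanner to index the new file so gallery apps can show it immediately.
MediaScannerConnection.scanFile(
        this,
        new String[]{ mediaFile.getAbsolutePath() },
        new String[]{ "image/jpeg" },
        null);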

How to save images in a specific folder on Android

I'm a newbie at Android programming.
I would like to create a folder to store images from my app when the user clicks save.
I can save an image to my device, but I have no idea how to create a specific folder. How do I create one?
Thanks for your help, and sorry, my English is not good.
My code
CapturePhotoUtils.insertImage(context.getContentResolver(), myBitmap, title ,des);
and CapturePhotoUtils.java
public class CapturePhotoUtils {
public static final String insertImage(ContentResolver cr,
Bitmap source,
String title,
String description) {
ContentValues values = new ContentValues();
values.put(Images.Media.TITLE, title);
values.put(Images.Media.DISPLAY_NAME, title);
values.put(Images.Media.DESCRIPTION, description);
values.put(Images.Media.MIME_TYPE, "image/jpeg");
// Add the date meta data to ensure the image is added at the front of the gallery
values.put(Images.Media.DATE_ADDED, System.currentTimeMillis());
values.put(Images.Media.DATE_TAKEN, System.currentTimeMillis());
Uri url = null;
String stringUrl = null; /* value to be returned */
try {
url = cr.insert(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, values);
if (source != null) {
OutputStream imageOut = cr.openOutputStream(url);
try {
source.compress(Bitmap.CompressFormat.JPEG, 50, imageOut);
} finally {
imageOut.close();
}
long id = ContentUris.parseId(url);
// Wait until MINI_KIND thumbnail is generated.
Bitmap miniThumb = Images.Thumbnails.getThumbnail(cr, id, Images.Thumbnails.MINI_KIND, null);
// This is for backward compatibility.
storeThumbnail(cr, miniThumb, id, 50F, 50F,Images.Thumbnails.MICRO_KIND);
} else {
cr.delete(url, null, null);
url = null;
}
} catch (Exception e) {
if (url != null) {
cr.delete(url, null, null);
url = null;
}
}
if (url != null) {
stringUrl = url.toString();
}
return stringUrl;
}
private static final Bitmap storeThumbnail(
ContentResolver cr,
Bitmap source,
long id,
float width,
float height,
int kind) {
// create the matrix to scale it
Matrix matrix = new Matrix();
float scaleX = width / source.getWidth();
float scaleY = height / source.getHeight();
matrix.setScale(scaleX, scaleY);
Bitmap thumb = Bitmap.createBitmap(source, 0, 0,
source.getWidth(),
source.getHeight(), matrix,
true
);
ContentValues values = new ContentValues(4);
values.put(Images.Thumbnails.KIND,kind);
values.put(Images.Thumbnails.IMAGE_ID,(int)id);
values.put(Images.Thumbnails.HEIGHT,thumb.getHeight());
values.put(Images.Thumbnails.WIDTH,thumb.getWidth());
Uri url = cr.insert(Images.Thumbnails.EXTERNAL_CONTENT_URI, values);
try {
OutputStream thumbOut = cr.openOutputStream(url);
thumb.compress(Bitmap.CompressFormat.JPEG, 100, thumbOut);
thumbOut.close();
return thumb;
} catch (FileNotFoundException ex) {
return null;
} catch (IOException ex) {
return null;
}
}
Try this:
addToFav("/Favorite", "add to favorite");
Create this function:
public void addToFav(String dirName, String str) {
String timeStamp = new SimpleDateFormat("ddMMyyyy_HHmmss").format(new Date());
String fileName = "fav" + timeStamp + ".JPG";
File direct = new File(Environment.getExternalStorageDirectory() + dirName);
if (!direct.exists()) {
File wallpaperDirectory = new File(Environment.getExternalStorageDirectory() + dirName);
wallpaperDirectory.mkdirs();
}
File file = new File(new File(Environment.getExternalStorageDirectory() + dirName), fileName);
if (file.exists()) {
file.delete();
}
try {
FileOutputStream out = new FileOutputStream(file);
Bitmap bitmap = BitmapFactory.decodeFile(imagesPathArrayList.get(pos));
bitmap.compress(Bitmap.CompressFormat.JPEG, 20, out);
out.flush();
out.close();
} catch (Exception e) {
e.printStackTrace();
}
ContentValues values = new ContentValues();
values.put(Images.Media.TITLE, "title");
values.put(Images.Media.DESCRIPTION, "description");
values.put(Images.Media.DATE_TAKEN, System.currentTimeMillis());
values.put(Images.ImageColumns.BUCKET_ID, file.toString().toLowerCase(Locale.US).hashCode());
values.put(Images.ImageColumns.BUCKET_DISPLAY_NAME, file.getName().toLowerCase(Locale.US));
values.put("_data", file.getAbsolutePath());
ContentResolver cr = getContentResolver();
cr.insert(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, values);
}
And don't forget to add the WRITE_EXTERNAL_STORAGE permission in the manifest file; a sketch of the runtime check is shown below.
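On Android 6.0 and above the storage permission also has to be granted at runtime, in addition to the manifest entry. A minimal sketch from inside an Activity (the request code is an arbitrary assumption):
// Manifest entry: <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
private static final int REQUEST_WRITE_STORAGE = 101;   // arbitrary request code

private void ensureStoragePermission() {
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE)
            != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(this,
                new String[]{Manifest.permission.WRITE_EXTERNAL_STORAGE},
                REQUEST_WRITE_STORAGE);
    }
}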

Android Camera2 API captures 0 KB images and gives a fully white preview; no errors or warnings in logcat

I use the Android Camera2 API on phones that support it in my app. It works great, but sometimes it captures 0 KB images and gives a fully white preview. It shows no errors or warnings in logcat. Below is my camera image capture and save code:
protected void takePictureViaCamera2() {
if (null == cameraDevice) {
Log.e(TAG, "cameraDevice is null");
return;
}
CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
try {
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraDevice.getId());
Size[] jpegSizes = null;
if (characteristics != null) {
jpegSizes = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputSizes(ImageFormat.JPEG);
}
int width = 640;
int height = 480;
if (jpegSizes != null && 0 < jpegSizes.length) {
width = jpegSizes[0].getWidth();
height = jpegSizes[0].getHeight();
}
ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 1);
List<Surface> outputSurfaces = new ArrayList<Surface>(2);
outputSurfaces.add(reader.getSurface());
outputSurfaces.add(new Surface(textureView.getSurfaceTexture()));
final CaptureRequest.Builder captureBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(reader.getSurface());
captureBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
// Orientation
int rotation = getWindowManager().getDefaultDisplay().getRotation();
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
//final File file = new File(Environment.getExternalStorageDirectory()+"/pic.jpg");
final File file = createImageFile();
final String checkPath = file.getParent();
Log.e("checkPath", checkPath);
//photoFile = createImageFile();
//photoFilePath = photoFile.getParent();
//Log.e("photoFilePath", photoFilePath);
ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Image image = null;
try {
image = reader.acquireLatestImage();
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.capacity()];
buffer.get(bytes);
save(bytes);
} catch (FileNotFoundException e) {
e.printStackTrace();
Log.e("readerListenerFNF_EX", e + "");
} catch (IOException e) {
e.printStackTrace();
Log.e("readerListenerIO_EX", e + "");
} catch (Exception e) {
e.printStackTrace();
Log.e("readerListener_EX", e + "");
} finally {
if (image != null) {
image.close();
}
}
}
private void save(byte[] bytes) throws IOException {
OutputStream output = null;
try {
output = new FileOutputStream(file);
//output = new FileOutputStream(photoFile);
output.write(bytes);
} finally {
if (null != output) {
output.close();
}
}
}
};
reader.setOnImageAvailableListener(readerListener, mBackgroundHandler);
final CameraCaptureSession.CaptureCallback captureListener = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
Toast.makeText(TravelChargesCamera2Activity.this, "Saved***:" + file, Toast.LENGTH_SHORT).show();
createCameraPreview();
closeCamera();
Thread thread = new Thread() {
@Override
public void run() {
try {
synchronized (this) {
wait(5000);
runOnUiThread(new Runnable() {
@Override
public void run() {
toolbar.setVisibility(View.VISIBLE);
header.setVisibility(View.VISIBLE);
assessorTraveldetails_scroll.setVisibility(View.VISIBLE);
camera2Layout.setVisibility(View.GONE);
try {
if (!file.getParentFile().isDirectory()) {
travelDirectory = new File(checkPath);
Log.e("travelDir==created==", travelDirectory + "");
} else {
travelDirectory = file.getParentFile();
Log.e("travelDir==exists==", travelDirectory + "");
}
files = travelDirectory.listFiles();
Log.e("files", files + "");
// removes old images from view
// prevents display duplication
travelGallery.removeAllViews();
// loop displays the photos captured on Horizontal ScrollView
for (File picFile : files) {
travelGallery.addView(insertPhoto(picFile.getAbsolutePath()));
byte[] byteArray = ImageUtils.fileToByteArray(file);
Log.e("byteArray", byteArray + "");
}
} catch (Exception e) {
e.printStackTrace();
Log.e("file", e + "");
}
}
});
}
} catch (InterruptedException e) {
e.printStackTrace();
Log.e("InterruptedException", e + "");
} catch (Exception e) {
e.printStackTrace();
Log.e("Exception", e + "");
}
}
};
thread.start();
}
};
cameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
try {
session.capture(captureBuilder.build(), captureListener, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {
}
}, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
Log.e("CameraAccessException", e + "");
} catch (Exception e) {
e.printStackTrace();
Log.e("takePictureViaCamera2", e + "");
}
}
private File createImageFile() throws IOException {
// Create an image file name
String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
String uniqueId = UUID.randomUUID().toString();
String imageFileName = "CAP_" + timeStamp + "_" + uniqueId;
String bCode = "";
masterMap = (LinkedTreeMap) gson.fromJson(batchMasterObject.getMasterJson(), Object.class);
auditMap = (LinkedTreeMap) masterMap.get("2001");
if ((auditMap.get("batchCode") != null && !auditMap.get("batchCode").equals(""))) {
bCode = auditMap.get("batchCode").toString();
} else if (auditMap.get("cbc") != null && !auditMap.get("cbc").equals("")) {
bCode = auditMap.get("cbc").toString();
}
File storageDir = new File(getExternalFilesDir(Environment.DIRECTORY_PICTURES), "/" + bCode + "/Travel/");
if (!storageDir.exists()) {
storageDir.mkdirs();
}
String checkPath = storageDir.getPath();
Log.e("checkPath", checkPath);
Toast.makeText(getApplicationContext(), checkPath, Toast.LENGTH_LONG).show();
File image = File.createTempFile(
imageFileName, /* prefix */
".jpg", /* suffix */
storageDir /* directory */
);
// Save a file: path for use with ACTION_VIEW intents
mCurrentPhotoPath = storageDir.getAbsolutePath();
return image;
}
Could anyone help with this?
NOTE: The camera captures and stores an image file as intended, but the file size is 0 KB. When the image file is opened on a phone or PC, it says "Unsupported format".
It looks like you're taking your final JPEG capture as the very first frame.
This means the camera's auto-exposure routines have not had any time to work (since you're grabbing the first frame), so the image will be captured with the initial settings hardcoded into the camera driver. This may work for some scenes, but in others, the image will be far too bright or far too dark.
For best results, you should first let the camera meter for a while, until auto-exposure and auto-focus are converged.
Generally, this can be done by setting up a low-resolution preview output (use a SurfaceTexture for example), and running it for a second or two (or wait until a CaptureResult has an AE_STATE of CONVERGED). Then issue the JPEG capture.
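A rough sketch of that convergence check, run from the repeating preview request's CaptureCallback (the stillCaptureIssued flag and the call back into takePictureViaCamera2() are assumptions about how this would be wired into the code above):
private final CameraCaptureSession.CaptureCallback previewCallback =
        new CameraCaptureSession.CaptureCallback() {
    @Override
    public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request,
                                   TotalCaptureResult result) {
        super.onCaptureCompleted(session, request, result);
        Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
        Integer afState = result.get(CaptureResult.CONTROL_AF_STATE);
        // Some devices report null for 3A states; treat that as "ready".
        boolean aeReady = aeState == null || aeState == CaptureResult.CONTROL_AE_STATE_CONVERGED;
        boolean afReady = afState == null
                || afState == CaptureResult.CONTROL_AF_STATE_FOCUSED_LOCKED
                || afState == CaptureResult.CONTROL_AF_STATE_PASSIVE_FOCUSED;
        if (aeReady && afReady && !stillCaptureIssued) {   // stillCaptureIssued: assumed boolean field
            stillCaptureIssued = true;
            takePictureViaCamera2();                        // the capture method from the question
        }
    }
};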

The right way to load a PDF into an ImageView

I'm looking for the proper way to load a PDF file into an ImageView.
I use the BitmapWorkerTask class from the Android documentation.
I have a button which allows me to choose the file I want to load into the ImageView. When I click on this file, the process begins.
My issue is that my PDF loads perfectly only about 3 times out of 5, and I don't understand why it does not work the other times.
ImageView map;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
map = (ImageView)findViewById(R.id.pdf);
}
/**
* Use this to load a pdf file from your assets and render it to a Bitmap.
*
* @param context
* current context.
* @param filePath
* of the pdf file in the assets.
* @return a bitmap.
*/
@Nullable
public static Bitmap renderToBitmap(Context context, String filePath) {
Bitmap bi = null;
InputStream inStream = null;
try {
inStream = new FileInputStream(filePath);
bi = renderToBitmap(context, inStream);
} catch (IOException e) {
e.printStackTrace();
} finally {
try {
inStream.close();
} catch (IOException e) {
// do nothing because the stream has already been closed
}
}
return bi;
}
/**
* Use this to render a pdf file given as InputStream to a Bitmap.
*
* @param context
* current context.
* @param inStream
* the inputStream of the pdf file.
* @return a bitmap.
* @see https://github.com/jblough/Android-Pdf-Viewer-Library/
*/
@Nullable
public static Bitmap renderToBitmap(Context context, InputStream inStream) {
Bitmap bi = null;
try {
byte[] decode = IOUtils.toByteArray(inStream);
ByteBuffer buf = ByteBuffer.wrap(decode);
PDFPage mPdfPage = new PDFFile(buf).getPage(0);
float width = mPdfPage.getWidth();
float height = mPdfPage.getHeight();
RectF clip = null;
bi = mPdfPage.getImage((int) (width), (int) (height), clip, true,
true);
} catch (IOException e) {
e.printStackTrace();
} finally {
try {
inStream.close();
} catch (IOException e) {
// do nothing because the stream has already been closed
}
}
return bi;
}
private void renderMap() {
String mapFilePath = Environment.getExternalStorageDirectory().toString()+"/Android/data/com.empower.data/"+mapFileName;
BitmapWorkerTask task = new BitmapWorkerTask(map, mapFilePath, MainActivity.this);
task.loadBitmap(R.id.pdf, map, mapFilePath);
map.setVisibility(View.VISIBLE);
map.invalidate();
}
And my BitmapWorkerTask.java
public class BitmapWorkerTask extends AsyncTask<Integer, Void, Bitmap> {
private final WeakReference<ImageView> imageViewReference;
private int data = 0;
private String mapFilePath1;
private Context context;
private RelativeLayout loadingPanel;
public BitmapWorkerTask(ImageView imageView, String mapFilePath2, Context context) {
// Use a WeakReference to ensure the ImageView can be garbage collected
this.context = context;
mapFilePath1 = mapFilePath2;
imageViewReference = new WeakReference<ImageView>(imageView);
}
// Decode image in background.
@Override
protected Bitmap doInBackground(Integer... params) {
data = params[0];
Bitmap bm = MainActivity.renderToBitmap(context , mapFilePath1);
return bm;
}
@Override
protected void onPreExecute() {
loadingPanel = (RelativeLayout)((Activity) context).findViewById(R.id.loadingPanel);
loadingPanel.setVisibility(View.VISIBLE);
}
// Once complete, see if ImageView is still around and set bitmap.
@Override
protected void onPostExecute(final Bitmap bitmap) {
if (imageViewReference != null && bitmap != null) {
final ImageView imageView = imageViewReference.get();
if (imageView != null) {
new Handler().postDelayed(new Runnable() {
@Override
public void run() {
imageView.setImageBitmap(bitmap);
loadingPanel.setVisibility(View.GONE);
}
}, 500);
}
}
}
public void loadBitmap(int resId, ImageView imageView, String mapFilePath1) {
BitmapWorkerTask task = new BitmapWorkerTask(imageView, mapFilePath1, context);
task.execute(resId);
}
}
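As an aside, on API 21 and above the platform class android.graphics.pdf.PdfRenderer can render a page straight to a Bitmap without a third-party library. A minimal sketch, assuming filePath points to a readable PDF on storage:
public static Bitmap renderFirstPage(String filePath) throws IOException {
    ParcelFileDescriptor fd = ParcelFileDescriptor.open(
            new File(filePath), ParcelFileDescriptor.MODE_READ_ONLY);
    PdfRenderer renderer = new PdfRenderer(fd);
    PdfRenderer.Page page = renderer.openPage(0);
    // The page size is in points; scale up here if the ImageView needs a larger bitmap.
    Bitmap bitmap = Bitmap.createBitmap(page.getWidth(), page.getHeight(), Bitmap.Config.ARGB_8888);
    page.render(bitmap, null, null, PdfRenderer.Page.RENDER_MODE_FOR_DISPLAY);
    page.close();
    renderer.close();
    fd.close();
    return bitmap;
}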

Decoding a QR code from an image stored on the phone with ZXing (on an Android phone)

I have an app that receives a QR code from the server. I want to decode it (not with an intent and the camera) and display the text it contains in my app. I have already done this in Java SE with the ZXing jars, using this code:
private class QRCodeDecoder {
public String decode(File imageFile) {
BufferedImage image;
try {
image = ImageIO.read(imageFile);
} catch (IOException e1) {
return "io outch";
}
// creating luminance source
LuminanceSource lumSource = new BufferedImageLuminanceSource(image);
BinaryBitmap bitmap = new BinaryBitmap(new HybridBinarizer(lumSource));
// barcode decoding
QRCodeReader reader = new QRCodeReader();
Result result = null;
try {
result = reader.decode(bitmap);
} catch (ReaderException e) {
return "reader error";
}
return result.getText();
}
}
But on Android, BufferedImage is not available.
Has anyone decoded a QR code on Android from an image stored on the phone?
Thanks.
In Android, you can do it this way:
@Override
protected Result doInBackground(Void... params)
{
try
{
InputStream inputStream = activity.getContentResolver().openInputStream(uri);
Bitmap bitmap = BitmapFactory.decodeStream(inputStream);
if (bitmap == null)
{
Log.e(TAG, "uri is not a bitmap," + uri.toString());
return null;
}
int width = bitmap.getWidth(), height = bitmap.getHeight();
int[] pixels = new int[width * height];
bitmap.getPixels(pixels, 0, width, 0, 0, width, height);
bitmap.recycle();
bitmap = null;
RGBLuminanceSource source = new RGBLuminanceSource(width, height, pixels);
BinaryBitmap bBitmap = new BinaryBitmap(new HybridBinarizer(source));
MultiFormatReader reader = new MultiFormatReader();
try
{
Result result = reader.decode(bBitmap);
return result;
}
catch (NotFoundException e)
{
Log.e(TAG, "decode exception", e);
return null;
}
}
catch (FileNotFoundException e)
{
Log.e(TAG, "can not open file" + uri.toString(), e);
return null;
}
}
Download ZXing from Google Code; this class file can help you: ZXing-1.6/zxing-1.6/androidtest/src/com/google/zxing/client/androidtest/RGBLuminanceSource.java
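The doInBackground shown above belongs inside an AsyncTask; a minimal wrapper might look like this (the class name, the hypothetical decodeFromUri helper that would hold that body, and the way the result is displayed are all assumptions):
public class DecodeQrTask extends AsyncTask<Void, Void, Result> {
    private final Activity activity;
    private final Uri uri;

    public DecodeQrTask(Activity activity, Uri uri) {
        this.activity = activity;
        this.uri = uri;
    }

    @Override
    protected Result doInBackground(Void... params) {
        // Body as in the answer above, extracted into a hypothetical helper.
        return decodeFromUri(activity, uri);
    }

    @Override
    protected void onPostExecute(Result result) {
        if (result != null) {
            // result.getText() is the decoded content of the QR code.
            Toast.makeText(activity, result.getText(), Toast.LENGTH_LONG).show();
        }
    }
}

// Usage: new DecodeQrTask(this, imageUri).execute();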
QuickMark and QR Droid actually read out what the code says, and you can decode barcodes saved on your phone. Hit the menu button when you load the image, select Share, and pick "Decode" with QR Droid or QuickMark, and they'll do the magic. I prefer QuickMark for reading codes, because it tells me what is typed in the code.
