I'm looking to add GPS location data to an image as it is saved. As far as I'm aware, this needs to be done after the OutputStream closes. I've written and wired in the code below, but no GPS location data is written. As far as I can tell, I have prepared all the data the EXIF tags require.
GPS Converting Code
public String getLon(Location location) {
if (location == null) return "0/1,0/1,0/1000";
String[] degMinSec = Location.convert(location.getLongitude(), Location.FORMAT_SECONDS).split(":");
return degMinSec[0] + "/1," + degMinSec[1] + "/1," + degMinSec[2] + "/1000";
}
public String getLat(Location location) {
if (location == null) return "0/1,0/1,0/1000";
String[] degMinSec = Location.convert(location.getLatitude(), Location.FORMAT_SECONDS).split(":");
return degMinSec[0] + "/1," + degMinSec[1] + "/1," + degMinSec[2] + "/1000";
}
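For reference, a quick way to sanity-check what these helpers produce (a sketch; the provider name and coordinates are illustrative, not from my app):
Location probe = new Location("test");
probe.setLatitude(37.422);
probe.setLongitude(-122.084);
// Location.convert(..., FORMAT_SECONDS) yields "DD:MM:SS.sss"-style strings,
// so this should log rational-style values along the lines of "37/1,25/1,19.2/1000"
Log.d("GPS", "lat=" + getLat(probe) + " lon=" + getLon(probe));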
Code to save the Image
file = new File(Environment.getExternalStorageDirectory() + "/DCIM/GeoVista/" + UUID.randomUUID().toString() + ".jpg");
ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader imageReader) {
Image image = null;
try {
image = imageReader.acquireLatestImage();
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.capacity()];
buffer.get(bytes);
save(bytes);
storeGeoCoordsToImage(bytes);
save(bytes);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
if (image != null)
image.close();
}
}
private void save(byte[] bytes) throws IOException {
OutputStream outputStream = null;
try {
outputStream = new FileOutputStream(file);
outputStream.write(bytes);
} finally {
if (outputStream != null)
outputStream.close();
}
}
};
EXIF Data Code
public boolean storeGeoCoordsToImage(File file, Location location) {
// Avoid NullPointer
if (file == null || location == null) return false;
try {
ExifInterface exif = new ExifInterface(file.getAbsolutePath());
exif.setAttribute(ExifInterface.TAG_GPS_LATITUDE, getLat(location));
exif.setAttribute(ExifInterface.TAG_GPS_LATITUDE_REF, location.getLatitude() < 0 ? "S" : "N");
exif.setAttribute(ExifInterface.TAG_GPS_LONGITUDE, getLon(location));
exif.setAttribute(ExifInterface.TAG_GPS_LONGITUDE_REF, location.getLongitude() < 0 ? "W" : "E");
exif.saveAttributes();
} catch (IOException e) {
// do something
return false;
}
// Data was likely written. For sure no NullPointer.
return true;
}
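For clarity, the ordering I am aiming for is roughly this (a sketch only; location would be the last fix from the location callback I already hold, which isn't shown above):
save(bytes); // write the JPEG and close the stream first
storeGeoCoordsToImage(file, location); // then re-open the file and add the GPS tags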
Any Input would be greatly appreciated.
This error occurs mostly when selecting images from the Recents folder:
class com.bumptech.glide.load.engine.GlideException: Received null model
Code to select multiple images
Sample Preview
Intent i = new Intent(Intent.ACTION_GET_CONTENT);
i.addCategory(Intent.CATEGORY_OPENABLE);
i.setType("image/*");
i.putExtra(Intent.EXTRA_ALLOW_MULTIPLE, true);
gallery.launch(i);
gallery basically replaces startActivityForResult(i, 123); since onActivityResult() is deprecated, gallery is the alternative, defined below:
ActivityResultLauncher<Intent> gallery = choosePhotoFromGallery();
and choosePhotoFromGallery() is the method defined below:
private ActivityResultLauncher<Intent> choosePhotoFromGallery() {
return registerForActivityResult(
new ActivityResultContracts.StartActivityForResult(),
result -> {
try {
if (result.getResultCode() == RESULT_OK) {
if (null != result.getData()) {
if (result.getData().getClipData() != null) {
ClipData mClipData = result.getData().getClipData();
for (int i = 0; i < mClipData.getItemCount(); i++) {
ClipData.Item item = mClipData.getItemAt(i);
Uri uri = item.getUri();
String imageFilePathColumn = getPathFromURI(this, uri);
productImagesList.add(imageFilePathColumn);
}
} else {
if (result.getData().getData() != null) {
Uri mImageUri = result.getData().getData();
String imageFilePathColumn = getPathFromURI(this, mImageUri);
productImagesList.add(imageFilePathColumn);
}
}
} else {
showToast(this, "You haven't picked Image");
productImagesList.clear();
}
} else {
productImagesList.clear();
}
} catch (Exception e) {
e.printStackTrace();
showToast(this, "Something went wrong");
productImagesList.clear();
}
});
}
and getPathFromURI() is the method defined below:
public String getPathFromURI(Context context, Uri contentUri) {
OutputStream out;
File file = getPath();
try {
if (file.createNewFile()) {
InputStream iStream = context != null ? context.getContentResolver().openInputStream(contentUri) : context.getContentResolver().openInputStream(contentUri);
byte[] inputData = getBytes(iStream);
out = new FileOutputStream(file);
out.write(inputData);
out.close();
return file.getAbsolutePath();
}
} catch (IOException e) {
e.printStackTrace();
}
return null;
}
private byte[] getBytes(InputStream inputStream) throws IOException {
ByteArrayOutputStream byteBuffer = new ByteArrayOutputStream();
int bufferSize = 1024;
byte[] buffer = new byte[bufferSize];
int len = 0;
while ((len = inputStream.read(buffer)) != -1) {
byteBuffer.write(buffer, 0, len);
}
return byteBuffer.toByteArray();
}
and getPath() is:
private File getPath() {
File folder = new File(Environment.getExternalStorageDirectory(), "Download");
if (!folder.exists()) {
folder.mkdir();
}
return new File(folder.getPath(), System.currentTimeMillis() + ".jpg");
}
Thank you in advance, and happy coding.
I want to get the attached content of an MMS, such as an image, video, or audio.
First I wrote this:
static void getMmsContent(Context context, ArrayList<Mms> mmsArrayList) {
try {
for (Mms unMms : mmsArrayList) {
ContentResolver contentResolver = context.getContentResolver();
Uri uri = Uri.parse("content://mms/part");
String selection = Telephony.Mms.Part.MSG_ID + "=" + unMms.getId();
Cursor query = contentResolver.query(uri, null, selection, null, null);
if (query != null && query.moveToFirst()) {
do {
String name = query.getString(query.getColumnIndex("name"));
String type = query.getString(query.getColumnIndex("ct"));
String txt = query.getString(query.getColumnIndex(Telephony.Mms.Part.TEXT));
String data = query.getString(query.getColumnIndex(Telephony.Mms.Part._DATA));
if (!type.equals("application/smil")) {
String[] dataMms = {name, type, txt, data};
getContent(context, dataMms, unMms);
}
} while (query.moveToNext());
}
if (query != null) {
query.close();
}
}
} catch (Exception e) {
Log.d("Exception", e.toString());
}
}
This line gives me the path to the attached content:
String data = query.getString(query.getColumnIndex(Telephony.Mms.Part._DATA));
/data/user_de/0/com.android.providers.telephony/app_parts/PART_1555841710097_Screenshot_20190421-121445_Chrome1.jpg
Now I want to turn the image into a Bitmap so I can add it to a zip file.
static private void getContent(Context context, String[] dataMms, Mms unMms){
if (dataMms[1].equals("text/plain")) {
unMms.setCorps(dataMms[2]);
} else {
if ("image/jpeg".equals(dataMms[1]) || "image/bmp".equals(dataMms[1]) ||
"image/gif".equals(dataMms[1]) || "image/jpg".equals(dataMms[1]) ||
"image/png".equals(dataMms[1])) {
unMms.setTypeContenu(dataMms[1]);
Bitmap bitmap = null;
InputStream is = null;
try {
File source = new File(dataMms[3]);
is = new FileInputStream(source);
bitmap = BitmapFactory.decodeStream(is);
} catch (IOException e) {
Log.d("Exception", e.toString());
} finally {
if (is != null) {
try {
is.close();
} catch (IOException e) {
Log.d("Exception", e.toString());
}
}
}
if (bitmap != null) {
File file = new File(context.getApplicationInfo().dataDir + "/files/", dataMms[0]);
OutputStream Fout = null;
try {
Fout = new FileOutputStream(file);
bitmap.compress(Bitmap.CompressFormat.PNG, 100, Fout);
Fout.flush();
Fout.close();
} catch (FileNotFoundException e) {
Log.d("Exception", e.toString());
} catch (IOException e) {
Log.d("Exception", e.toString());
}
}
}
}
}
But my code throws an exception on new FileInputStream(source);
I get this:
D/Exception: java.io.FileNotFoundException: /data/user_de/0/com.android.providers.telephony/app_parts/PART_1547316880687_Resized_20190112_191438_9422.jpeg (Permission denied)
I have declared the permissions and requested them from the user:
<uses-permission android:name="android.permission.READ_SMS" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
So, after CommonsWare's comment, I changed my code to this:
static private void getContent(Context context, String[] dataMms, Mms unMms) {
if (dataMms[1].equals("text/plain")) {
unMms.setCorps(dataMms[2]);
} else {
if ("image/jpeg".equals(dataMms[1]) || "image/bmp".equals(dataMms[1]) ||
"image/gif".equals(dataMms[1]) || "image/jpg".equals(dataMms[1]) ||
"image/png".equals(dataMms[1])) {
unMms.setTypeContenu(dataMms[1]);
Uri partURI = Uri.parse("content://mms/part/" + dataMms[4]);
InputStream is = null;
Bitmap bitmap = null;
try {
is = context.getContentResolver().openInputStream(partURI);
bitmap = BitmapFactory.decodeStream(is);
} catch (IOException e) {
Log.d("Exception", e.toString());
} finally {
if (is != null) {
try {
is.close();
} catch (IOException e) {
Log.d("Exception", e.toString());
}
}
}
if (bitmap != null) {
File file = new File(context.getApplicationInfo().dataDir + "/files/", dataMms[0]);
OutputStream Fout = null;
try {
file.createNewFile();
Fout = new FileOutputStream(file);
bitmap.compress(Bitmap.CompressFormat.PNG, 100, Fout);
Fout.flush();
Fout.close();
} catch (FileNotFoundException e) {
Log.d("Exception", e.toString());
} catch (IOException e) {
Log.d("Exception", e.toString());
}
}
}
}
}
The tricky part is this:
Uri partURI = Uri.parse("content://mms/part/" + dataMms[4]);
dataMms[4] is the id of the MMS part; I get it from this line, which I added in getMmsContent():
String id = query.getString(query.getColumnIndex("_id"));
This column gives me the id of the part.
But there is no mention of this column in the Android developer documentation: https://developer.android.com/reference/android/provider/Telephony.Mms.Part.html
So I listed the columns with this code in getMmsContent() and found it:
for (int i = 0; i < query.getColumnCount(); i++) {
Log.i("Column", query.getColumnName(i));
}
Now it's working!
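For the zip step I mentioned earlier, this is the kind of thing I have in mind (a sketch; it assumes the PNG files written by getContent() end up under the app's files directory, and the method name is illustrative):
static void zipMmsParts(Context context, File zipFile) throws IOException {
    File partsDir = new File(context.getApplicationInfo().dataDir + "/files/");
    File[] parts = partsDir.listFiles();
    if (parts == null) return; // nothing extracted yet
    try (ZipOutputStream zos = new ZipOutputStream(new FileOutputStream(zipFile))) {
        for (File part : parts) {
            zos.putNextEntry(new ZipEntry(part.getName()));
            try (FileInputStream in = new FileInputStream(part)) {
                byte[] buf = new byte[4096];
                int len;
                while ((len = in.read(buf)) != -1) {
                    zos.write(buf, 0, len);
                }
            }
            zos.closeEntry();
        }
    }
}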
I use the Android Camera2 API in my app for phones that support it. It works great, but sometimes it captures 0 KB images and shows a fully white preview, with no errors or warnings in Logcat. Below is my image capture and save code:
protected void takePictureViaCamera2() {
if (null == cameraDevice) {
Log.e(TAG, "cameraDevice is null");
return;
}
CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
try {
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraDevice.getId());
Size[] jpegSizes = null;
if (characteristics != null) {
jpegSizes = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputSizes(ImageFormat.JPEG);
}
int width = 640;
int height = 480;
if (jpegSizes != null && 0 < jpegSizes.length) {
width = jpegSizes[0].getWidth();
height = jpegSizes[0].getHeight();
}
ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 1);
List<Surface> outputSurfaces = new ArrayList<Surface>(2);
outputSurfaces.add(reader.getSurface());
outputSurfaces.add(new Surface(textureView.getSurfaceTexture()));
final CaptureRequest.Builder captureBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(reader.getSurface());
captureBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
// Orientation
int rotation = getWindowManager().getDefaultDisplay().getRotation();
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
//final File file = new File(Environment.getExternalStorageDirectory()+"/pic.jpg");
final File file = createImageFile();
final String checkPath = file.getParent();
Log.e("checkPath", checkPath);
//photoFile = createImageFile();
//photoFilePath = photoFile.getParent();
//Log.e("photoFilePath", photoFilePath);
ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Image image = null;
try {
image = reader.acquireLatestImage();
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.capacity()];
buffer.get(bytes);
save(bytes);
} catch (FileNotFoundException e) {
e.printStackTrace();
Log.e("readerListenerFNF_EX", e + "");
} catch (IOException e) {
e.printStackTrace();
Log.e("readerListenerIO_EX", e + "");
} catch (Exception e) {
e.printStackTrace();
Log.e("readerListener_EX", e + "");
} finally {
if (image != null) {
image.close();
}
}
}
private void save(byte[] bytes) throws IOException {
OutputStream output = null;
try {
output = new FileOutputStream(file);
//output = new FileOutputStream(photoFile);
output.write(bytes);
} finally {
if (null != output) {
output.close();
}
}
}
};
reader.setOnImageAvailableListener(readerListener, mBackgroundHandler);
final CameraCaptureSession.CaptureCallback captureListener = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
Toast.makeText(TravelChargesCamera2Activity.this, "Saved***:" + file, Toast.LENGTH_SHORT).show();
createCameraPreview();
closeCamera();
Thread thread = new Thread() {
@Override
public void run() {
try {
synchronized (this) {
wait(5000);
runOnUiThread(new Runnable() {
@Override
public void run() {
toolbar.setVisibility(View.VISIBLE);
header.setVisibility(View.VISIBLE);
assessorTraveldetails_scroll.setVisibility(View.VISIBLE);
camera2Layout.setVisibility(View.GONE);
try {
if (!file.getParentFile().isDirectory()) {
travelDirectory = new File(checkPath);
Log.e("travelDir==created==", travelDirectory + "");
} else {
travelDirectory = file.getParentFile();
Log.e("travelDir==exists==", travelDirectory + "");
}
files = travelDirectory.listFiles();
Log.e("files", files + "");
// removes old images from view
// prevents display duplication
travelGallery.removeAllViews();
// loop displays the photos captured on Horizontal ScrollView
for (File picFile : files) {
travelGallery.addView(insertPhoto(picFile.getAbsolutePath()));
byte[] byteArray = ImageUtils.fileToByteArray(file);
Log.e("byteArray", byteArray + "");
}
} catch (Exception e) {
e.printStackTrace();
Log.e("file", e + "");
}
}
});
}
} catch (InterruptedException e) {
e.printStackTrace();
Log.e("InterruptedException", e + "");
} catch (Exception e) {
e.printStackTrace();
Log.e("Exception", e + "");
}
}
};
thread.start();
}
};
cameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
try {
session.capture(captureBuilder.build(), captureListener, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {
}
}, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
Log.e("CameraAccessException", e + "");
} catch (Exception e) {
e.printStackTrace();
Log.e("takePictureViaCamera2", e + "");
}
}
private File createImageFile() throws IOException {
// Create an image file name
String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
String uniqueId = UUID.randomUUID().toString();
String imageFileName = "CAP_" + timeStamp + "_" + uniqueId;
String bCode = "";
masterMap = (LinkedTreeMap) gson.fromJson(batchMasterObject.getMasterJson(), Object.class);
auditMap = (LinkedTreeMap) masterMap.get("2001");
if ((auditMap.get("batchCode") != null && !auditMap.get("batchCode").equals(""))) {
bCode = auditMap.get("batchCode").toString();
} else if (auditMap.get("cbc") != null && !auditMap.get("cbc").equals("")) {
bCode = auditMap.get("cbc").toString();
}
File storageDir = new File(getExternalFilesDir(Environment.DIRECTORY_PICTURES), "/" + bCode + "/Travel/");
if (!storageDir.exists()) {
storageDir.mkdirs();
}
String checkPath = storageDir.getPath();
Log.e("checkPath", checkPath);
Toast.makeText(getApplicationContext(), checkPath, Toast.LENGTH_LONG).show();
File image = File.createTempFile(
imageFileName, /* prefix */
".jpg", /* suffix */
storageDir /* directory */
);
// Save a file: path for use with ACTION_VIEW intents
mCurrentPhotoPath = storageDir.getAbsolutePath();
return image;
}
Could anyone help with this?
NOTE: the camera captures and stores an image file as intended, but the file size is 0 KB. When the image file is opened on a phone or PC, it says "Unsupported format".
It looks like you're taking your final JPEG capture as the very first frame.
This means the camera's auto-exposure routines have not had any time to work (since you're grabbing the first frame), so the image will be captured with the initial settings hardcoded into the camera driver. This may work for some scenes, but in others, the image will be far too bright or far too dark.
For best results, you should first let the camera meter for a while, until auto-exposure and auto-focus are converged.
Generally, this can be done by setting up a low-resolution preview output (use a SurfaceTexture for example), and running it for a second or two (or wait until a CaptureResult has an AE_STATE of CONVERGED). Then issue the JPEG capture.
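For example, the repeating preview request's callback could gate the still capture on the AE state, roughly like this (a sketch; mPreviewCallback and issueStillCapture() are illustrative names, not from the question):
private final CameraCaptureSession.CaptureCallback mPreviewCallback =
        new CameraCaptureSession.CaptureCallback() {
    @Override
    public void onCaptureCompleted(CameraCaptureSession session,
                                   CaptureRequest request,
                                   TotalCaptureResult result) {
        Integer aeState = result.get(CaptureResult.CONTROL_AE_STATE);
        // Treat a missing AE state (legacy devices) as good enough to shoot.
        if (aeState == null
                || aeState == CaptureResult.CONTROL_AE_STATE_CONVERGED
                || aeState == CaptureResult.CONTROL_AE_STATE_FLASH_REQUIRED) {
            issueStillCapture(); // hypothetical method wrapping session.capture(captureBuilder.build(), ...)
        }
    }
};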
Using this URL, I wrote the code below to encode onPreviewFrame data to an MP4 video, and I used a thread to do the job, but it doesn't seem to work properly.
private void initCodec() {
String root = Environment.getExternalStorageDirectory().toString();
File myDir = new File(root + "/Vocalist");
if(!myDir.exists()) {
myDir.mkdirs();
}
try {
File file = new File (myDir, "myVideo.mp4");
if(file.exists()){
file.delete();
}
fos = new FileOutputStream(file, false);
} catch (FileNotFoundException e) {
e.printStackTrace();
}
try {
mMediaCodec = MediaCodec.createEncoderByType("video/avc");
}
catch (Exception e){
}
MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc",
320,
240);
mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 500000);
mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
mMediaCodec.configure(mediaFormat,
null,
null,
MediaCodec.CONFIGURE_FLAG_ENCODE);
mMediaCodec.start();
inputBuffers = mMediaCodec.getInputBuffers();
outputBuffers = mMediaCodec.getOutputBuffers();
}
private synchronized void encode(byte[] dataInput)
{
byte[] data = dataInput;
inputBuffers = mMediaCodec.getInputBuffers();// here changes
outputBuffers = mMediaCodec.getOutputBuffers();
int inputBufferIndex = mMediaCodec.dequeueInputBuffer(-1);
if (inputBufferIndex >= 0) {
ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
inputBuffer.clear();
inputBuffer.put(data);
mMediaCodec.queueInputBuffer(inputBufferIndex, 0, data.length, 0, 0);
} else {
return;
}
MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
int outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo, 0);
Log.i("tag", "outputBufferIndex-->" + outputBufferIndex);
do {
if (outputBufferIndex >= 0) {
ByteBuffer outBuffer = outputBuffers[outputBufferIndex];
System.out.println("buffer info-->" + bufferInfo.offset + "--"
+ bufferInfo.size + "--" + bufferInfo.flags + "--"
+ bufferInfo.presentationTimeUs);
byte[] outData = new byte[bufferInfo.size];
outBuffer.get(outData);
try {
if (bufferInfo.offset != 0) {
fos.write(outData, bufferInfo.offset, outData.length
- bufferInfo.offset);
} else {
fos.write(outData, 0, outData.length);
}
fos.flush();
Log.i("camera", "out data -- > " + outData.length);
mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
outputBufferIndex = mMediaCodec.dequeueOutputBuffer(bufferInfo,
0);
} catch (IOException e) {
e.printStackTrace();
}
} else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
outputBuffers = mMediaCodec.getOutputBuffers();
} else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
MediaFormat format = mMediaCodec.getOutputFormat();
}
} while (outputBufferIndex >= 0);
}
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
if (mHolder.getSurface() == null) {
return;
}
try {
initCodec();
mCamera.setPreviewDisplay(mHolder);
mCamera.setPreviewCallback(new Camera.PreviewCallback() {
@Override
public void onPreviewFrame(final byte[] bytes, Camera camera) {
if (recording == true) {
if(mThread.isAlive())
encode(bytes);
}
}
});
} catch (Exception e) {
Log.d("TAG", "Error starting camera preview: " + e.getMessage());
}
}
}
public void newOpenCamera() {
if (mThread == null) {
mThread = new CameraHandlerThread();
}
synchronized (mThread) {
mThread.openCamera();
}
}
private static void oldOpenCamera() {
try {
c = Camera.open(1);
Camera.Parameters parameters = c.getParameters();
parameters.set("orientation", "portrait");
parameters.setJpegQuality(100);
parameters.setPreviewFormat(ImageFormat.NV21);
parameters.setPreviewSize(320, 240);
c.setParameters(parameters);
}
catch (RuntimeException e) {
Log.e("camera", "failed to open front camera");
}
}
public CameraHandlerThread mThread = null;
public static class CameraHandlerThread extends HandlerThread {
Handler mHandler = null;
CameraHandlerThread() {
super("CameraHandlerThread");
start();
mHandler = new Handler(getLooper());
}
synchronized void notifyCameraOpened() {
notify();
}
public void openCamera() {
mHandler.post(new Runnable() {
@Override
public void run() {
oldOpenCamera();
notifyCameraOpened();
}
});
}
}
I converted the onPreviewFrame data to a video, but after the first second the video doesn't play smoothly. What should I do?
First, you're not forwarding the timing information with the frames:
mMediaCodec.queueInputBuffer(inputBufferIndex, 0, data.length, 0, 0)
So your BufferInfo.presentationTimeUs will always be zero when you dequeue the buffer.
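For example, a per-frame timestamp could be derived from a frame counter and passed to queueInputBuffer (a sketch; frameIndex is an illustrative field, not in the original code):
long presentationTimeUs = 1_000_000L * frameIndex / 15; // 15 fps, matching KEY_FRAME_RATE above
mMediaCodec.queueInputBuffer(inputBufferIndex, 0, data.length, presentationTimeUs, 0);
frameIndex++;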
Second, you don't appear to be using MediaMuxer, which means you're just writing the raw H.264 stream to a file. This is not ".mp4"; it doesn't include the timing information at all. Many video players don't even know what to do with plain H.264.
Wrapping the file as .mp4, with the frame timing from the camera, should yield better results.
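A rough outline of that, assuming the encoder setup stays as in the question (the muxer field and outputPath are illustrative):
MediaMuxer muxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
int trackIndex = -1;
boolean muxerStarted = false;

// Inside the drain loop, instead of fos.write(...):
if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    trackIndex = muxer.addTrack(mMediaCodec.getOutputFormat());
    muxer.start();
    muxerStarted = true;
} else if (outputBufferIndex >= 0) {
    if (muxerStarted && bufferInfo.size > 0
            && (bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) == 0) {
        // writeSampleData() relies on bufferInfo.presentationTimeUs being set correctly
        muxer.writeSampleData(trackIndex, outputBuffers[outputBufferIndex], bufferInfo);
    }
    mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
}

// When recording stops:
muxer.stop();
muxer.release();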
Your code structure appears to be assuming that it can feed one frame of input and get one frame of output, which isn't always the case. You want to keep the input full, and drain the output as it becomes available.
You can find more information and some sample code on bigflake and in Grafika.
I'm trying to resize the captured image so that it is smaller than 1 MB. This is what I've tried so far in onActivityResult, but it failed. I got the approach from Android take photo and resize it before saving on sd card.
ImageFitScreen.java
try {
Bitmap bitmap;
BitmapFactory.Options bitmapOptions = new BitmapFactory.Options();
bitmapOptions.inJustDecodeBounds = false;
bitmapOptions.inPreferredConfig = Bitmap.Config.RGB_565;
bitmapOptions.inDither = true;
bitmap = BitmapFactory.decodeFile(f.getAbsolutePath(), bitmapOptions);
Global.img = bitmap;
b.setImageBitmap(bitmap);
String path = android.os.Environment.getExternalStorageDirectory() + File.separator + "Phoenix" + File.separator + "default";
//p = path;
f.delete();
OutputStream outFile = null;
File file = new File(path, String.valueOf(System.currentTimeMillis()) + ".png");
try {
outFile = new FileOutputStream(file);
bitmap.compress(Bitmap.CompressFormat.PNG, 100, outFile);
//pic=file;
outFile.flush();
outFile.close();
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} catch (Exception e) {
e.printStackTrace();
}
} catch (Exception e) {
e.printStackTrace();
}
}
Claims.java
button.setOnClickListener(new View.OnClickListener() {
public void onClick(View arg0) {
if ((name != null && name.trim().length() > 0) && (result != null && result.trim().length() > 0)) {
// Toast.makeText(getActivity().getApplicationContext(), fk+"", Toast.LENGTH_LONG).show();
byte[] data=getBitmapAsByteArray(getActivity(),Global.img);// this is a function
Toast.makeText(getActivity().getApplicationContext(), data+"", Toast.LENGTH_LONG).show();
if(data==null)
{
Toast.makeText(getActivity(), "null", Toast.LENGTH_LONG).show();
}
else
{
Toast.makeText(getActivity(), " not null", Toast.LENGTH_LONG).show();
SB.insertStaffBenefit(name, data, description, result, fk);
}
}
}
});
return claims;
}
public static byte[] getBitmapAsByteArray(final Context context,Bitmap bitmap) {
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
bitmap.compress(Bitmap.CompressFormat.PNG, 0, outputStream);
Toast.makeText(context, outputStream.size()/1024+"KB", Toast.LENGTH_LONG).show();
return outputStream.toByteArray();
}
In Claims.java, it still displays a size of more than 1 MB.
Can someone help me? Thanks
Check out the link below:
How to resize Image in Android?
For a better UI, show a toast asking the user to upload an image of a specific size; if they still upload a larger image, check it on the backend or server, return a negative response, and prompt them to upload a smaller image again.
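As a rough illustration of that approach (a sketch, not the asker's code; note that PNG ignores the quality parameter, so JPEG is used here, and the names are illustrative):
public static byte[] compressUnderOneMb(String path) {
    BitmapFactory.Options opts = new BitmapFactory.Options();
    opts.inJustDecodeBounds = true; // first pass: read dimensions only
    BitmapFactory.decodeFile(path, opts);
    opts.inSampleSize = 1;
    while ((opts.outWidth / opts.inSampleSize) * (opts.outHeight / opts.inSampleSize) > 2_000_000) {
        opts.inSampleSize *= 2; // halve the decoded size until roughly 2 MP or less
    }
    opts.inJustDecodeBounds = false;
    Bitmap bitmap = BitmapFactory.decodeFile(path, opts);

    ByteArrayOutputStream out = new ByteArrayOutputStream();
    int quality = 90;
    do {
        out.reset();
        bitmap.compress(Bitmap.CompressFormat.JPEG, quality, out);
        quality -= 10; // lower quality until the payload fits under ~1 MB
    } while (out.size() > 1_000_000 && quality > 10);
    return out.toByteArray();
}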