I use the Camera2 API for capturing OCR images in my app (tablet only). Everything works, but there is one device (let's call it device A) where it doesn't. Here is the log:
2021-05-28 13:46:23.402 28882-28937/com. D/OpenGLRenderer: textureCacheSize 75497472
2021-05-28 13:46:23.416 28882-28882/com. I/CameraManagerGlobal: Connecting to camera service
2021-05-28 13:46:33.303 28882-28975/com. W/System.err: java.lang.NullPointerException: Attempt to invoke virtual method 'android.media.Image$Plane[] android.media.Image.getPlanes()' on a null object reference
2021-05-28 13:46:33.305 28882-28975/com. W/System.err: at com.Camera2Fragment$6$1.run(Camera2Fragment.java:927)
2021-05-28 13:46:33.307 28882-28975/com. W/System.err: at android.os.Handler.handleCallback(Handler.java:873)
2021-05-28 13:46:33.308 28882-28975/com. W/System.err: at android.os.Handler.dispatchMessage(Handler.java:99)
2021-05-28 13:46:33.310 28882-28975/com. W/System.err: at android.os.Looper.loop(Looper.java:193)
2021-05-28 13:46:33.311 28882-28975/com. W/System.err: at android.os.HandlerThread.run(HandlerThread.java:65)
Here is my CameraCaptureSession.CaptureCallback:
CameraCaptureSession.CaptureCallback CaptureCallback = new CameraCaptureSession.CaptureCallback() {
    @Override
    public void onCaptureCompleted(@NonNull CameraCaptureSession session, @NonNull CaptureRequest request, @NonNull TotalCaptureResult result) {
        Handler handler = new Handler();
        handler.postDelayed(new Runnable() {
            @Override
            public void run() {
                Bitmap bitmap = null;
                String nik;
                try {
                    System.out.println("Array plane : " + mImageReader.acquireLatestImage().getPlanes().toString());
                    // The error starts here: on other devices acquireLatestImage() returns an image,
                    // but on device A it returns null.
                    ByteBuffer buffer = mImageReader.acquireLatestImage().getPlanes()[0].getBuffer();
                    byte[] data = new byte[buffer.remaining()];
                    buffer.get(data);
                    bitmap = BitmapFactory.decodeByteArray(data, 0, data.length);
                } catch (NullPointerException e) {
                    // On device A, execution falls through to here.
                    e.printStackTrace();
                }
                if (bitmap == null) {
                    Toast.makeText(getActivity(), "Failed to get image, please try again", Toast.LENGTH_SHORT).show();
                } else {
                    // The rest of my code (image processing, OCR, etc.)
                }
            }
        }, delayMillis); // delay value not shown in the original snippet
    }
};
Extra:
Device A has a 10-inch 800x1280 screen and an 8 MP camera. I don't know whether that affects the surface, but the problem only happens on this device.
I have read several issues about getting an image from the Camera2 API, but my code works on many other devices.
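For reference, a minimal sketch of the defensive pattern that is usually suggested for this: acquire the frame inside ImageReader.OnImageAvailableListener and null-check it, instead of reading from a delayed Handler. mImageReader and mBackgroundHandler are assumed to exist as in the snippet above; this is not presented as the fix, just the pattern.
// Hedged sketch: read the JPEG from the ImageReader's own callback and
// tolerate a null Image instead of crashing on it.
mImageReader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
    @Override
    public void onImageAvailable(ImageReader reader) {
        Image image = reader.acquireLatestImage(); // may legitimately return null
        if (image == null) {
            return; // no frame ready yet; wait for the next callback
        }
        try {
            ByteBuffer buffer = image.getPlanes()[0].getBuffer();
            byte[] data = new byte[buffer.remaining()];
            buffer.get(data);
            Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length);
            // hand the bitmap over to the OCR code here
        } finally {
            image.close(); // always close, or the reader's buffer queue fills up
        }
    }
}, mBackgroundHandler);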
Related
I am downloading photos to the smartphone. For versions lower than Oreo there is no problem, but on Oreo my code isn't working. I tried this code in the emulator:
I implemented a function to save an image to external storage.
private void saveImageToExternalStorage(Bitmap finalBitmap,String name) {
String root = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES).toString();
File myDir = new File(root + "/xx");
myDir.mkdirs();
String fname = name + ".jpg";
File file = new File(myDir, fname);
if (file.exists())
file.delete();
try {
FileOutputStream out = new FileOutputStream(file);
finalBitmap.compress(Bitmap.CompressFormat.JPEG, 90, out);
out.flush();
out.close();
}
catch (Exception e) {
e.printStackTrace();
}
// Tell the media scanner about the new file so that it is
// immediately available to the user.
MediaScannerConnection.scanFile(this, new String[] { file.toString() }, null,
new MediaScannerConnection.OnScanCompletedListener() {
public void onScanCompleted(String path, Uri uri) {
Log.i("ExternalStorage", "Scanned " + path + ":");
Log.i("ExternalStorage", "-> uri=" + uri);
}
});
}
I am requesting permissions with the Dexter library:
Dexter.withActivity(MainActivity.this)
.withPermission(Manifest.permission.WRITE_EXTERNAL_STORAGE)
.withListener(new PermissionListener() {
@Override
public void onPermissionGranted(PermissionGrantedResponse response) {
SharedPreferences prefs = PreferenceManager.getDefaultSharedPreferences(MainActivity.this);
if (!prefs.getBoolean("firstTime", false)) {
task.execute();
SharedPreferences.Editor editor = prefs.edit();
editor.putBoolean("firstTime", true);
editor.commit();
}
}
@Override
public void onPermissionDenied(PermissionDeniedResponse response) {
Toast.makeText(MainActivity.this, "You need to allow permission if you want to use camera", Toast.LENGTH_LONG).show();
}
@Override
public void onPermissionRationaleShouldBeShown(PermissionRequest permission, PermissionToken token) {
token.continuePermissionRequest();
Toast.makeText(MainActivity.this, "You need to allow permission if you want to use camera", Toast.LENGTH_LONG).show();
}
}).check();
I save the images with an AsyncTask:
final AsyncTask<Void, Void, Void> task = new AsyncTask<Void, Void, Void>() {
private ProgressDialog dialog;
@Override
protected void onPreExecute()
{
this.dialog = new ProgressDialog(MainActivity.this);
this.dialog.setMessage(getString(R.string.newfeature));
this.dialog.setCancelable(false);
this.dialog.setOnCancelListener(new DialogInterface.OnCancelListener()
{
@Override
public void onCancel(DialogInterface dialog)
{
// cancel AsyncTask
cancel(false);
}
});
this.dialog.show();
}
@Override
protected Void doInBackground(Void... params)
{
// do your stuff
Bitmap myBitmap2 = BitmapFactory.decodeResource(getApplicationContext().getResources(), R.drawable.im2);
saveImageToExternalStorage(myBitmap2,"imag2");
myBitmap2.recycle();
return null;
}
@Override
protected void onPostExecute(Void result)
{
//called on ui thread
if (this.dialog != null) {
this.dialog.dismiss();
}
}
@Override
protected void onCancelled()
{
//called on ui thread
if (this.dialog != null) {
this.dialog.dismiss();
}
}
};
I can see the Storage permission is granted when I look at Settings --> Apps for my app, but the images are not saved correctly. In fact the images are saved, but all of them are green squares like this.
On top of that, it gives a permission-denied error although the permission is granted.
09-21 13:11:08.023 17636-17765/xx.xx W/System.err: java.io.FileNotFoundException: /storage/emulated/0/Pictures/xx/imag2.jpg (Permission denied)
09-21 13:11:08.024 17636-17765/xx.xx W/System.err: at java.io.FileOutputStream.open0(Native Method)
09-21 13:11:08.024 17636-17765/xx.xx W/System.err: at java.io.FileOutputStream.open(FileOutputStream.java:308)
09-21 13:11:08.033 17636-17765/xx.xx W/System.err: at java.io.FileOutputStream.<init>(FileOutputStream.java:238)
09-21 13:11:08.033 17636-17765/xx.xx W/System.err: at java.io.FileOutputStream.<init>(FileOutputStream.java:180)
09-21 13:11:08.033 17636-17765/xx.xx W/System.err: at xx.xx.MainActivity.saveImageToExternalStorage(MainActivity.java:804)
09-21 13:11:08.033 17636-17765/xx.xx W/System.err: at xx.xx.MainActivity.access$000(MainActivity.java:62)
09-21 13:11:08.033 17636-17765/xx.xx W/System.err: at xx.xx.MainActivity$1.doInBackground(MainActivity.java:119)
09-21 13:11:08.033 17636-17765/xx.xx W/System.err: at xx.xx.MainActivity$1.doInBackground(MainActivity.java:89)
09-21 13:11:08.033 17636-17765/xx.xx W/System.err: at android.os.AsyncTask$2.call(AsyncTask.java:333)
09-21 13:11:08.033 17636-17765/xx.xx W/System.err: at java.util.concurrent.FutureTask.run(FutureTask.java:266)
09-21 13:11:08.033 17636-17765/xx.xx W/System.err: at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:245)
09-21 13:11:08.033 17636-17765/xx.xx W/System.err: at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1167)
09-21 13:11:08.033 17636-17765/xx.xx W/System.err: at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:641)
09-21 13:11:08.033 17636-17765/xx.xx W/System.err: at java.lang.Thread.run(Thread.java:764)
Access the SD card's files
Use the DOCUMENT_TREE dialog to get the SD card's Uri.
Inform the user how to choose the SD card in the dialog (with pictures or GIF animations).
// call for document tree dialog
Intent intent = new Intent(Intent.ACTION_OPEN_DOCUMENT_TREE);
startActivityForResult(intent, REQUEST_CODE_OPEN_DOCUMENT_TREE);
In onActivityResult you'll have the selected directory's Uri (sdCardUri).
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
switch (requestCode) {
case REQUEST_CODE_OPEN_DOCUMENT_TREE:
if (resultCode == Activity.RESULT_OK) {
sdCardUri = data.getData();
}
break;
}
}
Now we must check whether the user
a. selected the SD card, and
b. selected the SD card that our file is on (some devices can have multiple SD cards).
We check both a and b by finding the file through the hierarchy, from the SD card root down to our file. If the file is found, both conditions a and b are satisfied.
//First we get `DocumentFile` from the `TreeUri` which in our case is `sdCardUri`.
DocumentFile documentFile = DocumentFile.fromTreeUri(this, sdCardUri);
//Then we split the file path into an array of strings.
//ex: parts: {"", "storage", "extSdCard", "MyFolder", "MyFolder", "myImage.jpg"}
//There is a reason for having two identical "MyFolder" names in my example
//path: it shows that repeated names in a path will not confuse the
//hierarchy search below.
String[] parts = (file.getPath()).split("\\/");
// findFile() will search documentFile for the first file
// with the expected DisplayName.
// We skip the first three items because we are already there (sdCardUri = /storage/extSdCard).
for (int i = 3; i < parts.length; i++) {
if (documentFile != null) {
documentFile = documentFile.findFile(parts[i]);
}
}
if (documentFile == null) {
// File not found by the tree search:
// the user selected the wrong directory as the SD card.
// Here you should explain how to pick the correct SD card
// and invoke the file chooser dialog again.
} else {
// File found on the SD card, so a correct SD card directory was selected.
// Save this path as the SD card root in your storage (SQLite, XML, txt, ...).
// Now do whatever you like with documentFile.
// Here I delete the file as an example.
if (documentFile.delete()) { // if deleting the file succeeded
// Remove the information related to your media from the ContentResolver;
// documentFile.delete() alone didn't do the trick for me.
// You must do this, otherwise you will end up showing an empty
// ImageView if you are getting your URLs from the MediaStore.
Uri mediaContentUri = ContentUris.withAppendedId(
MediaStore.Images.Media.EXTERNAL_CONTENT_URI,
longMediaId);
getContentResolver().delete(mediaContentUri , null, null);
}
}
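The example above uses deletion; if you instead need to write into the file you located, a minimal sketch could go through the ContentResolver. Here documentFile is assumed to point at the target file, and bitmapToSave is a placeholder for whatever Bitmap you want to persist.
// Hedged sketch: overwrite the located DocumentFile's contents via the ContentResolver.
try (OutputStream out = getContentResolver().openOutputStream(documentFile.getUri())) {
    bitmapToSave.compress(Bitmap.CompressFormat.JPEG, 90, out);
    out.flush();
} catch (IOException e) {
    e.printStackTrace();
}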
Note:
You must declare the external storage permission in your manifest, and for OS >= Marshmallow you must also request it at runtime inside the app.
https://stackoverflow.com/a/32175771/2123400
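For completeness, a minimal sketch of the two pieces that note refers to; REQUEST_CODE_WRITE_STORAGE is an arbitrary placeholder and the result would be handled in onRequestPermissionsResult():
// Hedged sketch. The manifest needs:
//   <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
// and on Marshmallow+ the permission must also be requested at runtime:
if (ContextCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE)
        != PackageManager.PERMISSION_GRANTED) {
    ActivityCompat.requestPermissions(this,
            new String[]{Manifest.permission.WRITE_EXTERNAL_STORAGE},
            REQUEST_CODE_WRITE_STORAGE); // arbitrary request code
}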
Edit the SD card's files
For editing an existing image on your SD card you don't need any of the above steps if you want to invoke another app to do it for you.
Here we invoke all the activities (from all installed apps) that can edit images. (Developers declare these capabilities in their app's manifest so that other apps/activities can reach them.)
On your editButton click event:
String mimeType = getMimeTypeFromMediaContentUri(mediaContentUri);
startActivityForResult(Intent.createChooser(new Intent(Intent.ACTION_EDIT).setDataAndType(mediaContentUri, mimeType).putExtra(Intent.EXTRA_STREAM, mediaContentUri).addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION), "Edit"), REQUEST_CODE_SHARE_EDIT_SET_AS_INTENT);
and this is how to get mimeType:
public String getMimeTypeFromMediaContentUri(Uri uri) {
String mimeType;
if (uri.getScheme().equals(ContentResolver.SCHEME_CONTENT)) {
ContentResolver cr = getContentResolver();
mimeType = cr.getType(uri);
} else {
String fileExtension = MimeTypeMap.getFileExtensionFromUrl(uri
.toString());
mimeType = MimeTypeMap.getSingleton().getMimeTypeFromExtension(
fileExtension.toLowerCase());
}
return mimeType;
}
Note:
On Android KitKat (4.4), don't ask the user to select the SD card, because DocumentsProvider is not applicable on that version of Android, so this approach gives us no way to access the SD card.
Look at the API level for DocumentsProvider:
https://developer.android.com/reference/android/provider/DocumentsProvider.html
I couldn't find anything that works on Android KitKat (4.4). If you find anything useful for KitKat, please share it with us.
On versions below KitKat, access to the SD card is already provided by the OS.
I am creating an Android app and need a feature that takes photos without user interaction.
I simply want a class, for example 'CameraService.java', that has a constructor taking camera settings (e.g. quality, resolution, etc.) and a public function called 'takePhoto' which returns the image as a Bitmap. I have been searching for a while for how to do this with the Camera2 API but have failed every time.
Most of the examples for doing this require the camera to be created inside the MainActivity or in a class that extends Activity (which I want to avoid).
CameraService.java
I used the following code, found in a Stack Overflow post (referenced below), which only saves the image to a hardcoded location (and uses the old Camera API), but I am experiencing errors.
package me.sam.camtest;
import android.hardware.Camera;
import android.util.Log;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
public class CameraService {
Camera mCamera;
private int quality;
public CameraService(int quality){
this.quality = quality;
}
private int findBackFacingCamera() {
int cameraId = -1;
int numberOfCameras = Camera.getNumberOfCameras();
for (int i = 0; i < numberOfCameras; i++) {
Camera.CameraInfo info = new Camera.CameraInfo();
Camera.getCameraInfo(i, info);
if (info.facing == Camera.CameraInfo.CAMERA_FACING_BACK) {
cameraId = i;
break;
}
}
return cameraId;
}
private boolean safeCameraOpen(int id) {
boolean qOpened = false;
try {
releaseCamera();
mCamera = Camera.open(id);
qOpened = (mCamera != null);
} catch (Exception e) {
e.printStackTrace();
}
return qOpened;
}
private void releaseCamera() {
if (mCamera != null) {
mCamera.release();
mCamera = null;
}
}
Camera.PictureCallback mCall = new Camera.PictureCallback() {
public void onPictureTaken(byte[] data, Camera camera) {
FileOutputStream outStream = null;
try {
outStream = new FileOutputStream("/sdcard/Image.jpg");
outStream.write(data);
outStream.close();
} catch (FileNotFoundException e){
Log.d("CAMERA", e.getMessage());
} catch (IOException e){
Log.d("CAMERA", e.getMessage());
}
}
};
public void takePhoto(){
// Should return bitmap in future
int back_cam = findBackFacingCamera();
if(back_cam != -1){
safeCameraOpen(1);
mCamera.startPreview();
mCamera.takePicture(null, null, mCall);
}
}
}
Runtime Error:
E/AndroidRuntime: FATAL EXCEPTION: main
Process: me.sam.camtest, PID: 985
java.lang.RuntimeException: Unable to start activity ComponentInfo{me.sam.camtest/me.sam.camtest.MainActivity}: java.lang.RuntimeException: takePicture failed
at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2778)
at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2856)
at android.app.ActivityThread.-wrap11(Unknown Source:0)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1589)
at android.os.Handler.dispatchMessage(Handler.java:106)
at android.os.Looper.loop(Looper.java:164)
at android.app.ActivityThread.main(ActivityThread.java:6494)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:440)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:807)
Caused by: java.lang.RuntimeException: takePicture failed
at android.hardware.Camera.native_takePicture(Native Method)
at android.hardware.Camera.takePicture(Camera.java:1588)
at android.hardware.Camera.takePicture(Camera.java:1530)
at me.sam.camtest.CameraService.takePhoto(CameraService.java:74)
at me.sam.camtest.Reply.<init>(Reply.java:47)
at me.sam.camtest.MainActivity.onCreate(MainActivity.java:80)
at android.app.Activity.performCreate(Activity.java:6999)
at android.app.Activity.performCreate(Activity.java:6990)
at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1214)
at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2731)
at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2856)
at android.app.ActivityThread.-wrap11(Unknown Source:0)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1589)
at android.os.Handler.dispatchMessage(Handler.java:106)
at android.os.Looper.loop(Looper.java:164)
at android.app.ActivityThread.main(ActivityThread.java:6494)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:440)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:807)
I researched this error and it seemed that people could only fix it by adding a SurfaceView, which requires the CameraService to be an Activity or to be passed a Context (not what I want).
In conclusion, how could I achieve the following:
Have a simple camera class with one public method, "takePhoto", which returns a Bitmap
Avoid extending Activity or passing contexts in the simple camera class
Take the photo without user interaction (e.g. intents)
PS: I am very new to the Stack Overflow community; I have tried to follow the guidelines to the best of my ability, please don't be too harsh :)
My research:
https://androidmyway.wordpress.com/2012/09/07/capture-image/
How to take pictures in android application without the user Interface..?
http://www.41post.com/3794/programming/android-take-a-picture-without-displaying-a-preview
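Not an authoritative answer, but the workaround that usually comes up for this "takePicture failed" error is to give the legacy Camera an off-screen SurfaceTexture as its preview target, so no Activity, view or Context is needed. A minimal sketch fitted to the CameraService above; the texture name 0 is an arbitrary placeholder and error handling is reduced to the essentials:
// Hedged sketch: dummy SurfaceTexture so takePicture() has a preview to work from.
public void takePhoto() {
    int backCam = findBackFacingCamera();
    if (backCam == -1 || !safeCameraOpen(backCam)) {
        return;
    }
    try {
        mCamera.setPreviewTexture(new SurfaceTexture(0)); // never rendered anywhere
        mCamera.startPreview();
        mCamera.takePicture(null, null, mCall); // mCall writes the JPEG as before
    } catch (IOException e) {
        Log.d("CAMERA", "Could not set dummy preview texture", e);
    }
}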
I am having one of those mysterious Android issues (at least from my point of view). My app manages the device's camera via the Camera2 API. In my case I have two surfaces, one of them coming from an ImageReader. Next I define my capture session and set those surfaces as targets. As you can see in the code below, I am following the typical workflow for such cases:
// Create ImageReader Surface
int max = 2;
mReader = ImageReader.newInstance(mWidth, mHeight, ImageFormat.YV12, max);
ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader mReader) {
Image image = null;
image = mReader.acquireLatestImage();
if (image == null) {
return;
}
byte[] bytes = convertYUV420ToNV21(image);
nativeVideoFrame(bytes);
image.close();
}
};
if (OPENGL_SOURCE==2){
nativeVideoInit(mWidth, mHeight, 0, false);
}
mReader.setOnImageAvailableListener(readerListener, mBackgroundHandler);
// Create Texture Surface
texture = createTexture();
mSurfaceTexture = new SurfaceTexture(texture);
mSurfaceTexture.setOnFrameAvailableListener(this);
mSurfaceTexture.setDefaultBufferSize(imageDimension.getWidth(), imageDimension.getHeight());
mSurface = new Surface(mSurfaceTexture);
//Attach surfaces to CaptureRequest
List<Surface> outputSurfaces = new ArrayList<Surface>(2);
outputSurfaces.add(mReader.getSurface());
outputSurfaces.add(mSurface);
captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
captureRequestBuilder.addTarget(mSurface);
captureRequestBuilder.addTarget(mReader.getSurface());
//Define the capture request
cameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback(){
@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
//The camera is already closed
if (null == cameraDevice) {
return;
}
// When the session is ready, we start displaying the preview.
cameraCaptureSessions = cameraCaptureSession;
updatePreview();
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
Toast.makeText(MainActivity.this, "Configuration change", Toast.LENGTH_SHORT).show();
}
}, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
The thing is, I have no problem running this code on my Samsung Tab A tablet. However, when I try it on my Nexus 5X or my friend's Samsung S6, the app crashes dramatically, throwing this error:
08-23 11:28:51.772: E/AndroidRuntime(20315): FATAL EXCEPTION: main
08-23 11:28:51.772: E/AndroidRuntime(20315): Process: com.example.opengltest, PID: 20315
08-23 11:28:51.772: E/AndroidRuntime(20315): java.lang.IllegalArgumentException: Bad argument passed to camera service
08-23 11:28:51.772: E/AndroidRuntime(20315): at android.hardware.camera2.utils.CameraBinderDecorator.throwOnError(CameraBinderDecorator.java:114)
Doing some tests, I found that the problem comes from the ImageReader surface. If I remove this surface from the capture session's settings, the code runs seamlessly.
Why is this happening only on my Nexus 5X or Samsung S6 and not on my tablet? And how can I fix it?
Thanks,
JM
If you look at the whole system logcat, the camera service should have a
more detailed line that states why your output surface set is bad.
However, YV12 is not a format that's guaranteed to be supported, so it's probably just that. Some devices support it, some don't.
The only YUV format that's guaranteed to be supported by all devices is ImageFormat.YUV_420_888.
If you want to use YV12 when possible, you'll need to check the StreamConfigurationMap.getOutputFormats() list to see if it's listed before trying to use it. But you'll still need to fall back to YUV_420_888 on many devices, so it's simplest to just support that directly and nothing else.
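A minimal sketch of that check, assuming characteristics is the CameraCharacteristics of the camera being opened and mWidth, mHeight and max are as in the question:
// Hedged sketch: use YV12 only if the device advertises it, otherwise fall back
// to the universally supported YUV_420_888.
StreamConfigurationMap map =
        characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
int chosenFormat = ImageFormat.YUV_420_888; // guaranteed on every device
if (map != null) {
    for (int format : map.getOutputFormats()) {
        if (format == ImageFormat.YV12) {
            chosenFormat = ImageFormat.YV12;
            break;
        }
    }
}
mReader = ImageReader.newInstance(mWidth, mHeight, chosenFormat, max);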
I want to upload images to FTP using SimpleFTP, but the upload always fails. The error always points to ftp.connect(); I don't know why, so I hope you can help me.
Upload.java
public class ImageUpdate extends AppCompatActivity {
private static final String TAG_ID = "id";
private static final String TAG_PESAN = "message";
private static final String TAG_HASIL = "result";
private static final String TAG_IMAGE_ID = "id_image";
private static final String TAG_IMAGE_NAME= "image_name";
ProgressDialog pDialog;
JSONParser jparser = new JSONParser();
ArrayList<HashMap<String, String>> namelist, idList, imageList;
JSONArray names, names1, names2;
private static int RESULT_LOAD_IMG = 1;
String imgDecodableString = null;
Button submit;
static final String FTP_HOST = "xxxxxxxxxx";
static final String FTP_USER = "xxxxxxxxxxxx";
static final String FTP_PASS = "xxxxxxxxxxx";
String name, vid;
/**
* ATTENTION: This was auto-generated to implement the App Indexing API.
* See https://g.co/AppIndexing/AndroidStudio for more information.
*/
private GoogleApiClient client2;
SessionManagement session;
String nm,addr,pos,tlp,mail,usr,pass,id,image;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_image_update);
getSupportActionBar().setDisplayHomeAsUpEnabled(true);
getSupportActionBar().setHomeButtonEnabled(true);
client2 = new GoogleApiClient.Builder(this).addApi(AppIndex.API).build();
session =new SessionManagement(ImageUpdate.this);
HashMap<String, String> user = session.getUserDetails();
id=user.get(SessionManagement.KEY_ID);
nm=user.get(SessionManagement.KEY_NAME);
addr=user.get(SessionManagement.KEY_ALAMAT);
mail=user.get(SessionManagement.KEY_EMAIL);
tlp=user.get(SessionManagement.KEY_TELP);
usr=user.get(SessionManagement.KEY_USERNAME);
pass=user.get(SessionManagement.KEY_PASS);
submit=(Button) findViewById(R.id.buttonUploadPicture);
submit.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
if (imgDecodableString == null) {
Toast.makeText(ImageUpdate.this, "Choose image first, please", Toast.LENGTH_LONG);
} else {
File f = new File(imgDecodableString);
name = f.getName();
uploadFile(f);
}
}
});
}
public void loadImagefromGallery(View view) {
// Create intent to Open Image applications like Gallery, Google Photos
Intent galleryIntent = new Intent(Intent.ACTION_PICK,
MediaStore.Images.Media.EXTERNAL_CONTENT_URI);
// Start the Intent
startActivityForResult(galleryIntent, RESULT_LOAD_IMG);
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
try {
// When an Image is picked
if (requestCode == RESULT_LOAD_IMG && resultCode == RESULT_OK
&& null != data) {
// Get the Image from data
Uri selectedImage = data.getData();
String[] filePathColumn = {MediaStore.Images.Media.DATA};
// Get the cursor
Cursor cursor = getContentResolver().query(selectedImage,
filePathColumn, null, null, null);
// Move to first row
cursor.moveToFirst();
int columnIndex = cursor.getColumnIndex(filePathColumn[0]);
String filename = cursor.getString(columnIndex);
imgDecodableString = cursor.getString(columnIndex);
cursor.close();
File f = new File("" + imgDecodableString);
f.getName();
ImageView imgView = (ImageView) findViewById(R.id.imgView);
// Set the Image in ImageView after decoding the String
imgView.setImageBitmap(BitmapFactory
.decodeFile(imgDecodableString));
} else {
Toast.makeText(this, "Pilih Bukti Transaksi",
Toast.LENGTH_LONG).show();
}
} catch (Exception e) {
Toast.makeText(this, "Failed to Choose", Toast.LENGTH_LONG)
.show();
}
}
public void uploadFile(File fileName) {
StrictMode.ThreadPolicy policy = new StrictMode.ThreadPolicy.Builder().permitAll().build();
StrictMode.setThreadPolicy(policy);
SimpleFTP ftp=new SimpleFTP();
try {
ftp.connect("xxxxxxx", 21, "xxxxxxxx", "xxxxxxx");
ftp.bin();
ftp.cwd("img/imageProfil/");
ftp.stor(fileName);
ftp.disconnect();
} catch (Exception e) {
e.printStackTrace();
try {
ftp.disconnect();
Toast.makeText(ImageUpdate.this, "disconnect", Toast.LENGTH_LONG).show();
} catch (Exception e2) {
e2.printStackTrace();
Toast.makeText(ImageUpdate.this, "failed", Toast.LENGTH_LONG).show();
}
}
}
@Override
public void onStart() {
super.onStart();
// ATTENTION: This was auto-generated to implement the App Indexing API.
// See https://g.co/AppIndexing/AndroidStudio for more information.
client2.connect();
Action viewAction = Action.newAction(
Action.TYPE_VIEW, // TODO: choose an action type.
"Upload Page", // TODO: Define a title for the content shown.
// TODO: If you have web page content that matches this app activity's content,
// make sure this auto-generated web page URL is correct.
// Otherwise, set the URL to null.
Uri.parse("http://host/path"),
// TODO: Make sure this auto-generated app deep link URI is correct.
Uri.parse("android-app://com.amobi.newlomapodfix/http/host/path")
);
AppIndex.AppIndexApi.start(client2, viewAction);
}
@Override
public void onStop() {
super.onStop();
// ATTENTION: This was auto-generated to implement the App Indexing API.
// See https://g.co/AppIndexing/AndroidStudio for more information.
Action viewAction = Action.newAction(
Action.TYPE_VIEW, // TODO: choose an action type.
"Upload Page", // TODO: Define a title for the content shown.
// TODO: If you have web page content that matches this app activity's content,
// make sure this auto-generated web page URL is correct.
// Otherwise, set the URL to null.
Uri.parse("http://host/path"),
// TODO: Make sure this auto-generated app deep link URI is correct.
Uri.parse("android-app://com.amobi.newlomapodfix/http/host/path")
);
AppIndex.AppIndexApi.end(client2, viewAction);
client2.disconnect();
}
}
Stack trace:
06-20 22:32:00.692 2845-2845/com.amobi.newlomapodfix W/EGL_emulation: eglSurfaceAttrib not implemented
06-20 22:32:04.900 2845-2845/com.amobi.newlomapodfix W/System.err: java.io.IOException: SimpleFTP received an unknown response when connecting to the FTP server: 220---------- Welcome to Pure-FTPd [privsep] [TLS] ----------
06-20 22:32:04.900 2845-2845/com.amobi.newlomapodfix W/System.err: at org.jibble.simpleftp.SimpleFTP.connect(SimpleFTP.java:74)
06-20 22:32:04.900 2845-2845/com.amobi.newlomapodfix W/System.err: at com.amobi.newlomapodfix.UploadActivity.uploadFile(UploadActivity.java:167)
06-20 22:32:04.900 2845-2845/com.amobi.newlomapodfix W/System.err: at com.amobi.newlomapodfix.UploadActivity$1.onClick(UploadActivity.java:100)
06-20 22:32:04.900 2845-2845/com.amobi.newlomapodfix W/System.err: at android.view.View.performClick(View.java:4204)
06-20 22:32:04.900 2845-2845/com.amobi.newlomapodfix W/System.err: at android.view.View$PerformClick.run(View.java:17355)
06-20 22:32:04.900 2845-2845/com.amobi.newlomapodfix W/System.err: at android.os.Handler.handleCallback(Handler.java:725)
06-20 22:32:04.900 2845-2845/com.amobi.newlomapodfix W/System.err: at android.os.Handler.dispatchMessage(Handler.java:92)
06-20 22:32:04.900 2845-2845/com.amobi.newlomapodfix W/System.err: at android.os.Looper.loop(Looper.java:137)
06-20 22:32:04.900 2845-2845/com.amobi.newlomapodfix W/System.err: at android.app.ActivityThread.main(ActivityThread.java:5041)
06-20 22:32:04.900 2845-2845/com.amobi.newlomapodfix W/System.err: at java.lang.reflect.Method.invokeNative(Native Method)
06-20 22:32:04.904 2845-2845/com.amobi.newlomapodfix W/System.err: at java.lang.reflect.Method.invoke(Method.java:511)
06-20 22:32:04.908 2845-2845/com.amobi.newlomapodfix W/System.err: at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:793)
06-20 22:32:04.912 2845-2845/com.amobi.newlomapodfix W/System.err: at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:560)
06-20 22:32:04.912 2845-2845/com.amobi.newlomapodfix W/System.err: at dalvik.system.NativeStart.main(Native Method)
This answer explains a non-conformance with the FTP specification in SimpleFTP: the server greeting does start with 220, but the library nevertheless throws an exception on it.
(https://stackoverflow.com/a/24386510/6093353)
This tutorial implements a simple FTP upload; try following it:
http://androidexample.com/FTP_File_Upload_From_Sdcard_to_server/index.php?view=article_discription&aid=98
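If SimpleFTP keeps rejecting the multi-line 220 banner that Pure-FTPd sends, one commonly used alternative is Apache Commons Net's FTPClient, which handles such greetings. A minimal sketch of the same upload with it; host, credentials and the remote directory are placeholders, and fileName is the File passed to uploadFile():
// Hedged sketch using org.apache.commons.net.ftp.FTPClient instead of SimpleFTP.
FTPClient ftp = new FTPClient();
try {
    ftp.connect("ftp.example.com", 21);
    ftp.login("user", "password");
    ftp.enterLocalPassiveMode();            // friendlier to NAT and mobile networks
    ftp.setFileType(FTP.BINARY_FILE_TYPE);  // same role as SimpleFTP's bin()
    ftp.changeWorkingDirectory("img/imageProfil/");
    FileInputStream in = new FileInputStream(fileName);
    ftp.storeFile(fileName.getName(), in);
    in.close();
    ftp.logout();
} catch (IOException e) {
    e.printStackTrace();
} finally {
    try {
        if (ftp.isConnected()) ftp.disconnect();
    } catch (IOException ignored) {
    }
}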
I have implemented an Android camera with the help of the official developer.android.com tutorial. The app works fine sometimes, but about 3 out of 5 times the camera preview freezes after some rotation and button clicks, or even without those actions (other UI elements don't freeze). The strangest part is that when I debug the application the preview doesn't get stuck, but when I run the app normally the problem sometimes happens.
Here is my full-screen class, which consists of the code Android Studio generated for a full-screen activity plus the code implementing SurfaceHolder.Callback and the camera handling.
public class CameraActivity extends Activity implements SurfaceHolder.Callback {
... // some constants here
private Camera mCamera;
private SurfaceHolder mHolder;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_camera);
SurfaceView cameraSufaceView = (SurfaceView) findViewById(R.id.camera_preview);
// Accessing front camera to take picture
mCamera = openFrontFacingCameraGingerbread();
if (mCamera != null) {
// Install a SurfaceHolder.Callback so we get notified when the
// underlying surface is created and destroyed.
mHolder = cameraSufaceView.getHolder();
mHolder.addCallback(this);
} else {
// Alert the user
}
}
/**
* Gets an instance of front facing camera if available
*/
@SuppressWarnings("deprecation")
private Camera openFrontFacingCameraGingerbread() {
int cameraCount;
Camera cam = null;
Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
cameraCount = Camera.getNumberOfCameras();
for (int camIdx = 0; camIdx < cameraCount; camIdx++) {
Camera.getCameraInfo(camIdx, cameraInfo);
if (cameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
try {
cam = Camera.open(camIdx);
} catch (RuntimeException e) {
Log.e(TAG, "Camera failed to open: " + e.getLocalizedMessage());
}
}
}
return cam;
}
@Override
public void surfaceCreated(SurfaceHolder holder) {
// The Surface has been created, now tell the camera where to draw the preview.
try {
mCamera.setPreviewDisplay(holder);
mCamera.startPreview();
} catch (IOException e) {
Log.d(TAG, "Error setting camera preview: " + e.getMessage());
}
}
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
// If your preview can change or rotate, take care of those events here.
// Make sure to stop the preview before resizing or reformatting it.
if (mHolder.getSurface() == null) {
// preview surface does not exist
return;
}
// stop preview before making changes
try {
mCamera.stopPreview();
} catch (Exception e) {
// ignore: tried to stop a non-existent preview
}
// start preview with new settings
try {
// mCamera.setDisplayOrientation(needs degree here);
mCamera.setPreviewDisplay(mHolder);
mCamera.startPreview();
} catch (Exception e) {
Log.d(TAG, "Error starting camera preview: " + e.getMessage());
}
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
releaseCamera();
}
@Override
protected void onPause() {
super.onPause();
releaseCamera();
}
@Override
protected void onResume() {
super.onResume();
if (mCamera == null)
mCamera = openFrontFacingCameraGingerbread();
}
private void releaseCamera() {
if (mCamera != null) {
mCamera.stopPreview();
mCamera.release(); // release the camera for other applications
mCamera = null;
}
}
}
When this problem happens, some errors appear in LogCat, but they are not really informative. Here they are:
26893-26893/com.naviiid.retinaflash E/art﹕ No implementation found for void java.lang.Runtime.appStartupEnd() (tried Java_java_lang_Runtime_appStartupEnd and Java_java_lang_Runtime_appStartupEnd__)
09-16 01:34:09.336 26893-26893/com.naviiid.retinaflash E/ActivityThread﹕ appStartupEnd :
java.lang.reflect.InvocationTargetException
at java.lang.reflect.Method.invoke(Native Method)
at java.lang.reflect.Method.invoke(Method.java)
at android.app.ActivityThread.appStartupEnd(ActivityThread.java:305)
at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2819)
at android.app.ActivityThread.access$900(ActivityThread.java:177)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1448)
at android.os.Handler.dispatchMessage(Handler.java:102)
at android.os.Looper.loop(Looper.java:145)
at android.app.ActivityThread.main(ActivityThread.java:5942)
at java.lang.reflect.Method.invoke(Native Method)
at java.lang.reflect.Method.invoke(Method.java)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java)
Caused by: java.lang.UnsatisfiedLinkError: No implementation found for void java.lang.Runtime.appStartupEnd() (tried Java_java_lang_Runtime_appStartupEnd and Java_java_lang_Runtime_appStartupEnd__)
at java.lang.Runtime.appStartupEnd(Native Method)
at java.lang.reflect.Method.invoke(Native Method)
at java.lang.reflect.Method.invoke(Method.java)
at android.app.ActivityThread.appStartupEnd(ActivityThread.java:305)
at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2819)
at android.app.ActivityThread.access$900(ActivityThread.java:177)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1448)
at android.os.Handler.dispatchMessage(Handler.java:102)
at android.os.Looper.loop(Looper.java:145)
at android.app.ActivityThread.main(ActivityThread.java:5942)
at java.lang.reflect.Method.invoke(Native Method)
at java.lang.reflect.Method.invoke(Method.java)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java)
The only clue I have found so far is that Android sometimes calls the onPause method even when the activity isn't actually paused. To get an instance of the camera again, I call openFrontFacingCameraGingerbread() in the onResume method.
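Not a definitive fix, but that clue matches a common gap in this pattern: onResume() reopens the camera yet never restarts the preview, because surfaceCreated() only fires when the surface is recreated. A minimal sketch of an onResume that also restarts the preview when the surface already exists, using the names from the code above:
@Override
protected void onResume() {
    super.onResume();
    if (mCamera == null) {
        mCamera = openFrontFacingCameraGingerbread();
    }
    // If the surface survived the pause, surfaceCreated() is not called again,
    // so the preview has to be restarted here explicitly.
    if (mCamera != null && mHolder != null && mHolder.getSurface() != null) {
        try {
            mCamera.setPreviewDisplay(mHolder);
            mCamera.startPreview();
        } catch (IOException e) {
            Log.d(TAG, "Error restarting camera preview: " + e.getMessage());
        }
    }
}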