I am currently using the code from this answer, albeit heavily modified to suit my needs:
https://stackoverflow.com/a/28186621/4541217
For example, I need to take an image from the camera as well as select one from the gallery, and I am also zooming the image.
This all works nicely, except for one issue: I lose things when I rotate the device.
I have
bTemp = null;
if(getLastNonConfigurationInstance() != null) {
bTemp = getLastNonConfigurationInstance();
}
in my onCreate, and an override...
@Override
@Deprecated
public Object onRetainNonConfigurationInstance() {
    return bTemp;
}
I can make this return the image but I lose all of my stroke information.
From the example, I have tried saving the Uri, the alteredBitmap, the bitmap and the choosenImageView, but none of these work. If I take a photo, scribble on it and then rotate immediately (retaining the alteredBitmap), I get the first set of strokes back, but nothing drawn after that survives.
Can anyone help me to keep my stroke information on rotate please?
Learn about the activity lifecycle.
You need to override callbacks like onPause and onResume, and use the savedInstanceState Bundle (via onSaveInstanceState) to keep state across configuration changes.
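For example, a minimal sketch of the savedInstanceState approach (assuming a Bitmap field named alteredBitmap as in the linked example; note that large bitmaps can exceed the Binder transaction limit, so for big images onRetainNonConfigurationInstance or a retained fragment is often safer):
@Override
protected void onSaveInstanceState(Bundle outState) {
    super.onSaveInstanceState(outState);
    // Bitmap implements Parcelable, so small bitmaps can go straight into the Bundle.
    outState.putParcelable("alteredBitmap", alteredBitmap);
}

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // ... your existing setContentView and view setup ...
    if (savedInstanceState != null) {
        // Restore the drawing surface after rotation.
        alteredBitmap = savedInstanceState.getParcelable("alteredBitmap");
    }
}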
I managed to work it out eventually, so for anyone else who is trying to do the same, here is what I did.
Following on from the example link in my opening post, in order to make it stick while rotating:
In onRetainNonConfigurationInstance, return the alteredBitmap (this is in the Activity):
@Override
@Deprecated
public Object onRetainNonConfigurationInstance() {
    return alteredBitmap;
}
Then, in the onCreate of the activity:
if(getLastNonConfigurationInstance() != null) {
bmp = (Bitmap)getLastNonConfigurationInstance();
alteredBitmap = Bitmap.createBitmap(bmp.getWidth(), bmp.getHeight(), bmp.getConfig());
choosenImageView.setNewImage(alteredBitmap, bmp);
}
Notice that bmp receives what was returned from onRetainNonConfigurationInstance (the old alteredBitmap), and a fresh alteredBitmap is created as the new drawing surface. Both are then passed into setNewImage on the DrawableImageView.
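For reference, setNewImage in the DrawableImageView from the linked example looks roughly like this (a sketch from memory, so field names may differ slightly from the original answer): the retained bitmap becomes the new drawing surface, and the old content is copied onto it.
public void setNewImage(Bitmap alteredBitmap, Bitmap bmp) {
    canvas = new Canvas(alteredBitmap);      // future strokes are drawn into the retained bitmap
    paint = new Paint();
    paint.setColor(Color.GREEN);
    paint.setStrokeWidth(5);
    matrix = new Matrix();
    canvas.drawBitmap(bmp, matrix, paint);   // copy the previous photo plus strokes
    setImageBitmap(alteredBitmap);           // display the bitmap we are drawing into
}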
Related
I am trying to create a simple Android app using Java (Android Studio). What I want to implement is for the background color of the app to change based on the content of a TextView (which is constantly updated between two values, "Open" and "Closed"). I have seen that onChanged() must be used, but I don't really understand how to implement the listener for the TextView.
I tried writing this:
measurementValue = (TextView) findViewById(R.id.textbox_main);
measurementValue.addTextChangedListener(new TextWatcher() {});
based on what I have found online, but I can't really figure out what the next step is. I am a total newbie so please be kind.
Best regards!
TextWatcher gives you three callbacks: beforeTextChanged, onTextChanged, and afterTextChanged. Which one to use depends on your use case; for this, onTextChanged works fine.
@Override
public void onTextChanged(CharSequence textInput, int start, int before, int count) {
String strInput = textInput.toString();
if ("open".equalsIgnoreCase(strInput)) {
// text is "open"
// change background to desired color
} else if ("closed".equalsIgnoreCase(strInput)) {
// text is "closed"
// change background to desired color
}
}
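A fuller sketch of wiring it up from your snippet (rootLayout here is an assumption standing in for whichever View's background you want to change; TextWatcher requires all three methods to be overridden, even if some stay empty):
measurementValue = (TextView) findViewById(R.id.textbox_main);
measurementValue.addTextChangedListener(new TextWatcher() {
    @Override
    public void beforeTextChanged(CharSequence s, int start, int count, int after) {
        // not needed for this use case
    }

    @Override
    public void onTextChanged(CharSequence s, int start, int before, int count) {
        String strInput = s.toString();
        if ("open".equalsIgnoreCase(strInput)) {
            rootLayout.setBackgroundColor(Color.GREEN);   // rootLayout is an assumed View field
        } else if ("closed".equalsIgnoreCase(strInput)) {
            rootLayout.setBackgroundColor(Color.RED);
        }
    }

    @Override
    public void afterTextChanged(Editable s) {
        // not needed for this use case
    }
});
Note that a TextWatcher only fires when the TextView's text is actually changed (for example via setText), so make sure your "Open"/"Closed" updates go through the TextView.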
I have a video in my app and I'm trying to achieve the following: when the user rotates the device from portrait to landscape, the video changes to full-screen.
I'm using OrientationEventListener like this:
orientationEventListener = new OrientationEventListener(this, SensorManager.SENSOR_DELAY_NORMAL) {
@Override
public void onOrientationChanged(int orientation) {
if (orientation <= 45 && playerManager.isFullscreen()) {
onPlayerFullscreenChange(false); //ORIENTATION_PORTRAIT
} else if (orientation <= 135 && !playerManager.isFullscreen()) {
onPlayerFullscreenChange(true); //ORIENTATION_LANDSCAPE
} else if (orientation <= 225 && playerManager.isFullscreen()) {
onPlayerFullscreenChange(false); //ORIENTATION_PORTRAIT
} else if (orientation <= 315 && !playerManager.isFullscreen()) {
onPlayerFullscreenChange(true); //ORIENTATION_LANDSCAPE
}
}
};
The problem is that this listener gets called so many times that my video can't play normally. The activity ends up going through onCreate multiple times, unlike before.
The onOrientationChanged(int orientation) method notifies you every time the sensor detects a change in the way you are holding the Android device.
It is nearly impossible for a user to hold the device perfectly still, so onOrientationChanged() gets called many times.
What I have understood from your question is that you only want to display the video in fullscreen when the user holds the device horizontally.
So treat angular values of 0-45, 135-225, and 315-360 as VERTICAL,
and angles between 45-135 and 225-315 as HORIZONTAL.
This will make sense if you give it some thought.
Store the "previous orientation", which you can initialise to null (for example, if you use a String). Then compare it with the currently detected orientation; if they differ, take an action (enter or exit fullscreen) and save the current orientation as the new "previous orientation".
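A minimal sketch of that idea (reusing onPlayerFullscreenChange from your code; the key point is bucketing the raw angle into two coarse states and reacting only when the state changes):
private String previousOrientation = null;  // "VERTICAL" or "HORIZONTAL"

orientationEventListener = new OrientationEventListener(this, SensorManager.SENSOR_DELAY_NORMAL) {
    @Override
    public void onOrientationChanged(int orientation) {
        if (orientation == ORIENTATION_UNKNOWN) return;

        boolean horizontal = (orientation > 45 && orientation <= 135)
                || (orientation > 225 && orientation <= 315);
        String current = horizontal ? "HORIZONTAL" : "VERTICAL";

        // Only act when the coarse state actually changes.
        if (!current.equals(previousOrientation)) {
            previousOrientation = current;
            onPlayerFullscreenChange(horizontal);
        }
    }
};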
This problem really comes down to better implementing your algorithm. All the best!
I'm currently developing an Android application with AndroidPdfViewer: https://github.com/barteksc/AndroidPdfViewer
I want to create a feature where, when the user touches the screen, a point is placed on the PDF at that location. Afterwards I want to measure the distance between two points, but that is another problem.
I don't understand how to implement this feature of putting a point on the PDF. I found this: https://github.com/barteksc/AndroidPdfViewer/issues/554
So it's possible, but how? I don't get it.
I suppose I need to create a bitmap, but I can't draw on the PDF or put a marker.
Thanks for your time.
You have a sample file in his repo. If you go to this location, he has already created the method for loading a PDF. Then, if you go to this link, you will see that he has created the class and included good comments:
/** Render page fragment on {@link Surface}. Page must be opened before rendering. */
public void renderPage(PdfDocument doc, Surface surface, int pageIndex,
                       int startX, int startY, int drawSizeX, int drawSizeY) {
    renderPage(doc, surface, pageIndex, startX, startY, drawSizeX, drawSizeY, false);
}
The method you are searching for is the bookmark method:
/** Get table of contents (bookmarks) for given document */
public List<PdfDocument.Bookmark> getTableOfContents(PdfDocument doc) {
synchronized (lock) {
List<PdfDocument.Bookmark> topLevel = new ArrayList<>();
Long first = nativeGetFirstChildBookmark(doc.mNativeDocPtr, null);
if (first != null) {
recursiveGetBookmark(topLevel, doc, first);
}
return topLevel;
}
}
However, keep in mind that it might be necessary to use an AsyncTask to download a PDF!
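As a side note on the original question of putting a point where the user touches: one approach that relies only on standard Android APIs (not on AndroidPdfViewer internals, so treat this as a sketch under that assumption) is to place a transparent custom View on top of the PDFView and draw the markers there:
import java.util.ArrayList;
import java.util.List;

import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.PointF;
import android.util.AttributeSet;
import android.view.MotionEvent;
import android.view.View;

public class MarkerOverlayView extends View {
    private final List<PointF> points = new ArrayList<>();
    private final Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);

    public MarkerOverlayView(Context context, AttributeSet attrs) {
        super(context, attrs);
        paint.setColor(Color.RED);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_DOWN) {
            points.add(new PointF(event.getX(), event.getY()));
            invalidate();  // redraw with the new marker
        }
        return true;
    }

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        for (PointF p : points) {
            canvas.drawCircle(p.x, p.y, 10f, paint);
        }
    }
}
The drawback is that these markers live in screen coordinates, so they will not follow the page when the user scrolls or zooms; mapping them into page coordinates is the harder part of the problem.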
I am developing a chat application where I have implemented image sending. Without the images everything works fine, but now I'm facing issues in displaying the images that are sent / received. What I've done is download the image first and save it in external storage. My idea was simple: fetch the bitmap and set it in an ImageView in the getView method. There is nothing wrong with fetching and displaying the image, but the ListView ruins everything.
This is my code-
My ListView adapter's getView():
ViewHolder holder = null;
if(convertView == null ){
convertView = inflater.inflate(R.layout.chat_list, null);
holder = new ViewHolder();
holder.mMsg = (TextView) convertView.findViewById(R.id.text);
holder.mDate = (TextView) convertView.findViewById(R.id.text1);
holder.mLinLay=(LinearLayout) convertView.findViewById(R.id.linSide1);
holder.mRelLay=(RelativeLayout) convertView.findViewById(R.id.relSide);
holder.img=(ImageView)convertView.findViewById(R.id.image);
holder.progCircle=(ProgressBar)convertView.findViewById(R.id.pbHeaderProgress);
convertView.setTag(holder);
} else{
holder = (ViewHolder) convertView.getTag();
}
holder.position=position;
holder.img.setTag(position+"img");
HashMap<String, String> hm = list.get(position);
String date=hm.get("date");
long l = Long.parseLong(date);
long unixTime = System.currentTimeMillis();
CharSequence datedate=DateUtils.getRelativeTimeSpanString(l, unixTime, 1);
String status = hm.get("status");
String who = hm.get("who");
if(hm.get("message").contains("[[pic_message]]"))
{
String picName=hm.get("message").replace("[[pic_message]]", "");
final String picPath1=Environment.getExternalStorageDirectory()+"/App/sent_pics/"+picName+".jpg";
File file = new File(picPath1);
if(file.exists())
{
new AsyncTask<ViewHolder, Void, Bitmap>() {
private ViewHolder v;
Bitmap bm;
@Override
protected Bitmap doInBackground(ViewHolder... params) {
v = params[0];
bm = decodeSampledBitmapFromResource(picPath1,100,100);
return bm;
}
@Override
protected void onPostExecute(Bitmap result) {
super.onPostExecute(result);
if (v.img.getTag().equals(position+"img")) {
v.img.setImageBitmap(result);
}
}
}.execute(holder);
}else{
holder.img.setImageBitmap(null);
}
}else{
holder.img.setImageBitmap(null);
Spannable msg =getSmiledText(getApplicationContext(),hm.get("message"));
holder.mMsg.setText(msg);
}
holder.mDate.setText(datedate);
return convertView;
What the problem is:
The ListView shows up, but the images are displayed in random list items. I searched a lot for this kind of problem and tried the suggestions I found, but still no luck. The AsyncTask code is from Google's "Use a Background Thread" guidance. As you can see above, I didn't set any holder.mMsg.setText(msg) for image rows, but whenever the images are shown they are still accompanied by a message, which is not desirable. Also, scrolling is laggy when the images are shown. I have seen many messaging apps that run smoothly and never mix items up, so what do they use? While googling this problem, one answer suggested removing the convertView if/else and then everything would be fine, but that has performance issues which I don't want.
So my question is: what should I do? How do chat apps like Hangouts, BBM and WhatsApp achieve this so smoothly?
EDIT
The pictures themselves display fine; the main problem is that they appear in random positions.
Like this: I scroll to the image message and it is displayed. Then I scroll again and now the image is displayed in a different ListView item. My guess is that the view which originally held the image is now being reused by another row, but the image stays there. How can I fix that and make the ListView smooth (even with images) like the IM apps?
It's because the Android ListView reuses views: while your AsyncTask is executing doInBackground, the user may already have scrolled elsewhere and your code doesn't know anything about it, so it puts the image into a recycled view.
I would use Picasso to avoid the problem and let a well-tested library handle it for you.
I handled this problem in my application (a contacts app) using:
- A cache, to avoid sending multiple requests for the same image when I already have it in memory.
- A WeakReference<> passed to the AsyncTask, so that when it finishes it checks whether the ImageView is still valid before putting the bitmap into it, and then updates the cache.
- (Specific to my case) Since a contact may not have a photo, I keep a separate list of users without a photo, so I don't try to download them again.
- Every time getView() is called, I check whether the image is in the cache: if it is, I set it directly; otherwise I start the AsyncTask.
Bitmap bitmap = cache.get(number);
if (bitmap == null) {
Log.d(TAG, "I will download image for " + name + ".");
// i set a dummy image to avoid to show the wrong picture
holder.contactPhoto.setImageResource(R.drawable.ic_action_person);
loadUserProfilePhoto(number, holder.contactPhoto);
}
else {
// show it directly
Log.d(TAG, "I already have " + name + " in cache.");
holder.contactPhoto.setImageBitmap(bitmap);
}
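For illustration, loadUserProfilePhoto from the snippet above could look roughly like this (a sketch, since the real implementation isn't shown here; fetchContactPhoto is a hypothetical helper, and the tag check assumes the adapter tags each ImageView with its contact number):
private void loadUserProfilePhoto(final String number, ImageView imageView) {
    final WeakReference<ImageView> imageViewRef = new WeakReference<>(imageView);
    new AsyncTask<Void, Void, Bitmap>() {
        @Override
        protected Bitmap doInBackground(Void... params) {
            return fetchContactPhoto(number);   // hypothetical helper that loads the photo
        }

        @Override
        protected void onPostExecute(Bitmap bitmap) {
            if (bitmap == null) return;
            cache.put(number, bitmap);          // update the cache first
            ImageView target = imageViewRef.get();
            // Only set the bitmap if the ImageView is still alive and still
            // bound to this contact; this guards against view recycling.
            if (target != null && number.equals(target.getTag())) {
                target.setImageBitmap(bitmap);
            }
        }
    }.execute();
}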
As you can see, it could happen that another picture is shown instead of the user's photo while it is still loading, so I set a dummy picture instead of the old photo.
It's not the best implementation, but it shows that it can be quite hard to implement this yourself and get it right.
Read the Picasso documentation and see how you could use it in your code (it's simple, it's just one line).
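In the chat adapter from the question, that one line would look roughly like this (assuming the Picasso 2.x Picasso.with() API; R.drawable.ic_image_placeholder is a hypothetical placeholder drawable of your own):
Picasso.with(getApplicationContext())
       .load(new File(picPath1))
       .placeholder(R.drawable.ic_image_placeholder)  // shown while loading, instead of a stale image
       .into(holder.img);
Picasso cancels the previous request bound to holder.img when the row is recycled, which is exactly the mix-up described in the question.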
The issue is with the ListView, which continuously recycles and refreshes its item views.
So all your code is fine, but just set a default image on holder.img instead of making it null with holder.img.setImageBitmap(null).
And maybe in the AsyncTask too, you need to set a default image while your image is being loaded.
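For example (R.drawable.ic_image_placeholder is just a placeholder drawable of your own, not something from the question):
holder.img.setImageResource(R.drawable.ic_image_placeholder);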
If you want to display images on the fly, I think you should use a SurfaceView. It is a lot faster and has a higher refresh rate. I made a video streaming application where I tried to display each video frame in an ImageView and it failed, but when I tried a SurfaceView it worked out fine.
So basically, I have this code:
if(mCamera.getParameters().getMaxNumDetectedFaces()==0)
{
System.out.println("Face detection not avaliable");
}
else
{
System.out.println("Max faces: " + Integer.toString(mCamera.getParameters().getMaxNumDetectedFaces()));
}
mCamera.setFaceDetectionListener(new FaceDetectionListener() {
@Override
public void onFaceDetection(Face[] faces, Camera camera) {
// TODO Auto-generated method stub
System.out.println("Face detection callback called." + Integer.toString(faces.length));
}
});
After calling mCamera.startFaceDetection(), the callback is called and everything works as normal. However, if I change cameras, the same code results in the callback never being called. getMaxNumDetectedFaces() returns 35 for both cameras, so I assume face detection is supported on the front camera. I can switch the camera back and forth, calling this code each time, and it works for the back camera but not the front one.
Is there anything else I might be doing wrong?
Is it possible that the quality of the camera that's not working (the front one, right?) isn't good enough for face detection to work? The camera's image may be too noisy for the face detector. There are a lot of other variables that could be hindering this.
Also, searching around for the front camera, it looks like the front camera's coordinates may be mirrored. This is described in: http://developer.android.com/reference/android/hardware/Camera.Face.html
I hope this helps.
Is there a way to check whether the camera is actually being read? Java has always had some issues registering webcams, so perhaps first make sure you can see images from the camera.
By the way, if you want any further help, we will need to know more about the code, the library, etc.
This code will return the id of your front-facing camera; for other cameras you can change the Camera.CameraInfo check:
private int findFrontFacingCamera() {
int cameraId = -1;
// Search for the front facing camera
int numberOfCameras = Camera.getNumberOfCameras();
for (int i = 0; i < numberOfCameras; i++) {
Camera.CameraInfo info = new Camera.CameraInfo();
Camera.getCameraInfo(i, info);
if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
Log.d("FaceDetector", "Camera found");
cameraId = i;
break;
}
}
return cameraId;
}
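For example, assuming the helper above, the front camera could then be opened and face detection started like this (a sketch; myFaceDetectionListener and surfaceHolder stand in for your own listener and preview surface):
int frontCameraId = findFrontFacingCamera();
if (frontCameraId != -1) {
    try {
        Camera camera = Camera.open(frontCameraId);
        camera.setFaceDetectionListener(myFaceDetectionListener); // your listener from the question
        camera.setPreviewDisplay(surfaceHolder);                  // a preview surface must be set up first
        camera.startPreview();                                    // face detection requires a running preview
        camera.startFaceDetection();
    } catch (IOException e) {
        Log.e("FaceDetector", "Could not start camera preview", e);
    }
}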
I had code which worked on my Galaxy tablet, but on other devices it wouldn't call takePicture and, as a result, wouldn't call face detection. After searching for a while I found the following solution, which worked: I added this call in the class where takePicture is called:
camera.startPreview();
You can use the Webcam Capture library for capturing images from a webcam. It automatically detects the webcam, so no extra configuration is needed, and it also supports more than one webcam at a time.