I'm trying to build my first Android widget, and let me tell you, moving from Visual Studio Pro to Eclipse is not a smooth transition!
Anyway, I ran into the above error, and it's giving me some trouble. I've searched all the possible duplicates, and it seems this error occurs when the app hits the IPC transaction size limit (i.e., big images).
My small widget launches an IntentService to download some info from the web plus one image. The original size of the PNG is around 100 KB, but before updating the widget I downscale it to around 17 KB (checked with the memory tracker; the byte array was 16,400 bytes).
And yet the update fails. If I remove the image, the update succeeds.
This is the listener in my WidgetProvider that gets executed:
public void onHttpLoaded(Bitmap image, String titlestring, String subtitlestring)
{
    Context context = getApplicationContext();
    if (image == null) // no data?
    {
        // Note: without show() the toast is created but never displayed
        Toast.makeText(context, context.getResources().getString(R.string.null_data_toast), Toast.LENGTH_LONG).show();
        return; // exit!
    }
    try
    {
        RemoteViews widgetView = new RemoteViews(this.getPackageName(), R.layout.main);
        widgetView.setTextViewText(R.id.title, titlestring);
        // all others commented out

        // The problem!
        widgetView.setImageViewBitmap(R.id.main_icon, image);

        // Get the global AppWidgetManager
        AppWidgetManager manager = AppWidgetManager.getInstance(this);
        // Update the widget
        manager.updateAppWidget(Constants.THIS_APPWIDGET, widgetView);
    }
    catch (Exception e)
    {
        Log.d("onHttpLoaded", e.toString());
    }
}
And here is onHandleIntent() from my service, from which the code above gets called:
protected void onHandleIntent(Intent intent)
{
    resources = getApplicationContext().getResources();
    int iAppWidgetId;
    try
    {
        iAppWidgetId = intent.getIntExtra(AppWidgetManager.EXTRA_APPWIDGET_ID, AppWidgetManager.INVALID_APPWIDGET_ID);
    }
    catch (Exception e)
    {
        Log.d("Service", "WidgetID not found");
    }
    // commented-out stuff...

    // Launch a new httpTask
    httpTask myhttpTask = new httpTask(titlestring, subtitlestring, (myHttpListener) MyUpdateService.this, MyUpdateService.this);
    // Limit the size of the bitmap
    myhttpTask.setSampleSize(Constants.BITMAP_SAMPLE_SIZE); // size = 4
    myhttpTask.execute();
}
All tests are done in the emulator.
One detail that may be important: in logcat I get 20-30 failure messages reading "!!! FAILED BINDER TRANSACTION !!!".
Any ideas on how I messed this up?
Thanks!
Solved: it was a memory problem of the emulator.
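For anyone hitting the same error on a real device: RemoteViews updates are marshalled through binder IPC, which has a fixed transaction buffer (about 1 MB, shared by the whole process), so it still helps to cap the bitmap's decoded size. A minimal sketch, assuming the downloaded PNG is available as a byte array (the helper class name and parameters are illustrative, not from the original code):

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

public final class WidgetBitmaps {
    // Decode at a reduced resolution; inSampleSize = 4 yields roughly
    // 1/16 of the original pixel count (1/4 in each dimension).
    public static Bitmap decodeScaled(byte[] pngBytes, int sampleSize) {
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inSampleSize = sampleSize;
        // RGB_565 halves the memory per pixel compared to ARGB_8888.
        options.inPreferredConfig = Bitmap.Config.RGB_565;
        return BitmapFactory.decodeByteArray(pngBytes, 0, pngBytes.length, options);
    }
}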
Related
I am developing a CameraX app for the image analysis use case. I built it and ran it on my phone and it worked perfectly; I also tried it on a few other phones and it worked on them too. We actually plan to run the app on our Raspberry Pi 3B, which has LineageOS installed and the necessary cameras attached; we have also run other camera apps on it and they worked perfectly. But when I ran our app on it, it started, showed the animation it was supposed to show, and then nothing else happened. The app didn't crash, and we noticed the camera turning on and off a few times, but that's it. From the logs, I could see the app never runs the code in the setAnalyzer() block. Here's the code for the relevant part:
model = new LocalModel.Builder()
        .setAssetFilePath("model.tflite")
        .build();
CustomImageLabelerOptions customImageLabelerOptions = new CustomImageLabelerOptions.Builder(model)
        .setConfidenceThreshold(.98f)
        .setMaxResultCount(1)
        .build();
labeler = ImageLabeling.getClient(customImageLabelerOptions);
analysis = new ImageAnalysis.Builder()
        .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
        .setTargetResolution(new Size(1280, 720))
        .build();
Log.i(TAG, "before the analysis starts");
analysis.setAnalyzer(ContextCompat.getMainExecutor(this), image -> {
    Log.i(TAG, "Analysis started");
    int rotationDegrees = image.getImageInfo().getRotationDegrees();
    @SuppressLint("UnsafeOptInUsageError") Image mediaImage = image.getImage();
    if (mediaImage != null) {
        InputImage i = InputImage.fromMediaImage(mediaImage, rotationDegrees);
        labeler.process(i).addOnSuccessListener(new OnSuccessListener<List<ImageLabel>>() {
            @Override
            public void onSuccess(@NonNull @NotNull List<ImageLabel> imageLabels) {
                Log.i(TAG, "successfully started to process the image");
                String labels = "";
                for (ImageLabel label : imageLabels) {
                    Log.i(TAG, label.getText());
                    labels = label.getText();
                    Log.i(TAG, labels);
                }
                if (labels.equalsIgnoreCase("dew")) {
                    isDetectingBottles = true;
                    d.stop();
                    imageView.setBackgroundResource(R.drawable.unnffffamed);
                } else {
                    if (isDetectingBottles) {
                        isDetectingBottles = false;
                        imageView.setBackgroundResource(R.drawable.blink_animation);
                        d = (AnimationDrawable) imageView.getBackground();
                        d.start();
                    }
                }
                image.close();
            }
        }).addOnFailureListener(new OnFailureListener() {
            @Override
            public void onFailure(@NonNull @NotNull Exception e) {
                Log.i(TAG, "ImageLabeling process failed");
                e.printStackTrace();
                image.close();
            }
        });
    }
});
CameraSelector selector = CameraSelector.DEFAULT_BACK_CAMERA;
Log.i(TAG, "before binding to lifecycle");
ListenableFuture<ProcessCameraProvider> m = ProcessCameraProvider.getInstance(this);
m.addListener(new Runnable() {
    @Override
    public void run() {
        try {
            m.get().bindToLifecycle((LifecycleOwner) MainActivity.this, selector, analysis);
        } catch (InterruptedException e) {
            e.printStackTrace();
        } catch (ExecutionException e) {
            e.printStackTrace();
        }
    }
}, ContextCompat.getMainExecutor(this));
Here are a few things I tried:
At the beginning I didn't set any target resolution and left it to CameraX, but it didn't work back then either.
I tried using different cameras with the selector; using the front camera leads to a crash.
I tried calling image.close() at the end of the setAnalyzer block, but that's evidently not the issue, since the code works on other devices.
NOTE: the logs don't show any exceptions, so I have nothing to go on as to what's happening.
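One way to investigate (an assumption on my part, not a confirmed cause: the Pi's camera module may not advertise a YUV stream near the requested 1280x720, which could leave the analyzer without frames) is to query the supported output sizes through the Camera2 API and compare them with the target resolution. A minimal diagnostic sketch:

import android.content.Context;
import android.graphics.ImageFormat;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.util.Log;
import android.util.Size;

public final class CameraDiagnostics {
    // Logs the YUV_420_888 output sizes of every camera; analysis frames use this format.
    public static void logSupportedSizes(Context context) {
        try {
            CameraManager manager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
            for (String id : manager.getCameraIdList()) {
                CameraCharacteristics chars = manager.getCameraCharacteristics(id);
                StreamConfigurationMap map =
                        chars.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
                if (map == null) continue;
                for (Size size : map.getOutputSizes(ImageFormat.YUV_420_888)) {
                    Log.i("CameraDiagnostics", "camera " + id + " supports " + size);
                }
            }
        } catch (Exception e) {
            Log.e("CameraDiagnostics", "could not query cameras", e);
        }
    }
}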
Is there any way to capture every SceneView frame in ARCore as an array of bitmaps, or something like that?
I want to save every frame captured by the AR camera so I can process them later.
The problem is that with the current code I have a heavy performance load, which first slows the application down and then leads to a crash.
I'm new to Android/Java and I can't figure out where I'm going wrong.
Aside from that, something else is surprising.
When I log the inner block of takePhoto(), which is responsible for copying the ArSceneView into the bitmap array, the difference between the timestamps of successive calls is not constant and equal to 1 s / 30 frames, even though it is very important for me to capture the views at a fixed rate.
When the application is in capturing mode, I call this function every frame. One possible source of the problem could be the high number of threads created here.
private void takePhoto()
{
    view = arFragment.getArSceneView();
    // NOTE: Create a handler thread to offload the processing of the image
    final HandlerThread handlerThread = new HandlerThread("PixelCopier");
    handlerThread.start();
    // NOTE: Make the copy request
    PixelCopy.request(view, bitmap, copyResult -> {
        if (copyResult == PixelCopy.SUCCESS)
        {
            bitmapsBuffer[bufferIndex] = bitmap;
            bufferIndex++;
            if (bufferIndex == 120)
            {
                bufferIndex = 0;
                FlushBitmapsBuffer();
            }
        }
        else
        {
            messenger.message("Failed to copyPixel: " + copyResult);
        }
        handlerThread.quitSafely();
    }, new Handler(handlerThread.getLooper()));
}
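Two things stand out in the code above (observations, not guaranteed fixes): a new HandlerThread is created and torn down for every frame, which is costly at 30 fps, and if bitmap is a single reused field, as it appears, every buffer slot ends up pointing at the same pixels. A sketch that reuses one long-lived thread and allocates a fresh target bitmap per frame (startCopyThread/stopCopyThread are illustrative names; call them from, say, onCreate()/onDestroy()):

import android.graphics.Bitmap;
import android.os.Handler;
import android.os.HandlerThread;
import android.view.PixelCopy;

// One long-lived thread for all PixelCopy callbacks.
private HandlerThread copyThread;
private Handler copyHandler;

private void startCopyThread() {
    copyThread = new HandlerThread("PixelCopier");
    copyThread.start();
    copyHandler = new Handler(copyThread.getLooper());
}

private void takePhoto() {
    view = arFragment.getArSceneView();
    // Each frame gets its own target bitmap; the buffer stores references,
    // so reusing one Bitmap would overwrite earlier entries.
    Bitmap target = Bitmap.createBitmap(view.getWidth(), view.getHeight(),
            Bitmap.Config.ARGB_8888);
    PixelCopy.request(view, target, copyResult -> {
        if (copyResult == PixelCopy.SUCCESS) {
            bitmapsBuffer[bufferIndex] = target;
            bufferIndex++;
            if (bufferIndex == 120) {
                bufferIndex = 0;
                FlushBitmapsBuffer();
            }
        } else {
            messenger.message("Failed to copyPixel: " + copyResult);
        }
    }, copyHandler);
}

private void stopCopyThread() {
    copyThread.quitSafely();
}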
Then, after collecting 120 frames, I try to write the temporary array of bitmaps to the SD card using another service.
private void FlushBitmapsBuffer()
{
    Intent intent = new Intent(MainActivity.this, DataCopyService.class);
    intent.putExtra("BitmapsBuffer", bitmapsBuffer);
    intent.putExtra("StorageDataRoot", storageData.getRootDir());
    intent.putExtra("StorageDataProj", storageData.getProjectDir().getAbsolutePath());
    startService(intent);
}
The service works as follows:
@Override
public int onStartCommand(Intent intent, int flags, int startId)
{
    messenger.log("Service " + TAG + " started.");
    // The extra key must match the one used in FlushBitmapsBuffer();
    // note that the bitmaps are marshalled through binder IPC here.
    bitmapsBuffer = (Bitmap[]) intent.getParcelableArrayExtra("BitmapsBuffer");
    File file;
    file = new File(intent.getStringExtra("StorageDataRoot"));
    storageData.setRootDir(file);
    file = new File(intent.getStringExtra("StorageDataProj"));
    storageData.setProjectDir(file);
    new Thread(new Runnable() {
        @Override
        public void run() {
            messenger.log("Service inner thread.");
            // Save the data here using a file manager class
            stopSelf();
        }
    }).start();
    return Service.START_STICKY;
}
Does someone have any idea about what I'm doing wrong?
Capturing frames with Sceneform is supported via SceneView.startMirroringToSurface(). This renders the current view to a surface every frame.
It is used in the Video Recording Sample to capture a video.
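A minimal sketch of that approach, assuming a Sceneform ArSceneView obtained from arFragment and a MediaRecorder-backed Surface (the file name, size, and encoder settings are illustrative):

import android.media.MediaRecorder;
import android.view.Surface;
import java.io.File;

// Configure a MediaRecorder whose input surface will receive the mirrored frames.
MediaRecorder recorder = new MediaRecorder();
recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setOutputFile(new File(getExternalFilesDir(null), "scene.mp4").getAbsolutePath());
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
recorder.setVideoEncodingBitRate(10_000_000);
recorder.setVideoFrameRate(30);
recorder.setVideoSize(1280, 720);
recorder.prepare();
recorder.start();

// Mirror the AR scene into the recorder's surface every frame.
Surface surface = recorder.getSurface();
arFragment.getArSceneView().startMirroringToSurface(surface, 0, 0, 1280, 720);

// ...later, when capturing is done:
arFragment.getArSceneView().stopMirroringToSurface(surface);
recorder.stop();
recorder.release();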
I'm trying to pass a Bitmap from one Activity to another. I've tried multiple solutions, but they are not fast enough.
Currently I'm facing this problem: when I click the next button, the app freezes for 2 seconds and then moves to the next Activity, with the right Bitmap shown in the ImageView.
I found this solution on Stack Overflow. Here is the code:
Uri imageUri = intent.getParcelableExtra("URI");
if (imageUri != null) {
    imageView.setImageURI(imageUri);
    try {
        bitmap = MediaStore.Images.Media.getBitmap(this.getContentResolver(), imageUri);
    } catch (IOException e) {
        e.printStackTrace();
    }
} else {
    Toast.makeText(this, "No image is set to show", Toast.LENGTH_LONG).show();
}

btn_next_process.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View view) {
        if (bitmap == null) {
            Toast.makeText(CropResultActivity.this, "Emptyyy", Toast.LENGTH_SHORT).show();
        } else {
            try {
                // Write file
                String filename = "bitmap.png";
                FileOutputStream stream = CropResultActivity.this.openFileOutput(filename, Context.MODE_PRIVATE);
                bitmap.compress(Bitmap.CompressFormat.PNG, 100, stream);
                // Cleanup
                stream.close();
                // bitmap.recycle();
                // Pop intent
                Intent in1 = new Intent(CropResultActivity.this, InputProcessingActivity.class);
                in1.putExtra("image_data", filename);
                startActivity(in1);
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
});
Then I tried saving the file in a worker thread first and retrieving it when I click the next button. Now it's fast, but I'm getting the wrong Bitmap.
Here is the code:
Uri imageUri = intent.getParcelableExtra("URI");
if (imageUri != null) {
    imageView.setImageURI(imageUri);
    try {
        bitmap = MediaStore.Images.Media.getBitmap(this.getContentResolver(), imageUri);
        new Thread() {
            @Override
            public void run() {
                try {
                    // Write file
                    FileOutputStream stream = CropResultActivity.this.openFileOutput(filename, Context.MODE_PRIVATE);
                    bitmap.compress(Bitmap.CompressFormat.PNG, 100, stream);
                    // Cleanup
                    stream.close();
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }.run();
    } catch (IOException e) {
        e.printStackTrace();
    }
} else {
    Toast.makeText(this, "No image is set to show", Toast.LENGTH_LONG).show();
}

btn_next_process.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View view) {
        if (bitmap == null) {
            Toast.makeText(CropResultActivity.this, "Empty", Toast.LENGTH_SHORT).show();
        } else {
            // Pop intent
            Intent in1 = new Intent(CropResultActivity.this, InputProcessingActivity.class);
            in1.putExtra("image_data", filename);
            startActivity(in1);
        }
    }
});
In the second Activity I retrieve the Bitmap this way:
private void getIncomingIntent() {
    if (getIntent().hasExtra("image_data")) {
        try {
            String filename = getIntent().getStringExtra("image_data");
            FileInputStream is = this.openFileInput(filename);
            imageToProcess = BitmapFactory.decodeStream(is);
            process_detect_edges(imageToProcess);
            is.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
What am I doing wrong?
Just pass the image Uri to the next Activity, and load from the Uri in the other Activity.
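A minimal sketch of that approach, reusing the names from the question (the "image_uri" extra key is illustrative):

// In the first Activity: forward the Uri itself, not the decoded pixels.
Intent intent = new Intent(CropResultActivity.this, InputProcessingActivity.class);
intent.putExtra("image_uri", imageUri); // Uri is Parcelable, so this is cheap
startActivity(intent);

// In the second Activity: decode the image only when it is needed.
Uri imageUri = getIntent().getParcelableExtra("image_uri");
if (imageUri != null) {
    try {
        Bitmap bitmap = MediaStore.Images.Media.getBitmap(getContentResolver(), imageUri);
        process_detect_edges(bitmap);
    } catch (IOException e) {
        e.printStackTrace();
    }
}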
Create a trivial Service. Give that Service a public Bitmap mBitmap member.
Keep each Activity bound to the Service while they are between onStart() and onStop().
If your Activities have a reference to the Service, they can communicate directly via the mBitmap member. One Activity can set mBitmap, then start the other Activity. The second Activity can simply grab the reference (after binding, of course), and begin manipulating the Bitmap. Since everything happens on the UI thread, there are no synchronization concerns. And everything is quite fast.
This solution does not address problems of persistence: If the user leaves the app for a period of time (puts it the background, locks the screen, etc.), then the entire app may be destroyed, and mBitmap would be lost. However, if you're just trying to share data between two successive Activities, this is a straightforward way of doing it.
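A minimal sketch of such a Service (class and member names are illustrative; the Service must also be declared in the manifest):

import android.app.Service;
import android.content.Intent;
import android.graphics.Bitmap;
import android.os.Binder;
import android.os.IBinder;

public class BitmapService extends Service {
    public Bitmap mBitmap; // shared state; accessed only on the UI thread

    public class LocalBinder extends Binder {
        public BitmapService getService() {
            return BitmapService.this;
        }
    }

    private final IBinder mBinder = new LocalBinder();

    @Override
    public IBinder onBind(Intent intent) {
        return mBinder;
    }
}

Each Activity then calls bindService() in onStart() and unbindService() in onStop(); the ServiceConnection's onServiceConnected() callback hands you the LocalBinder, from which you grab the Service reference and access mBitmap directly.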
You could even share the Bitmap via a public static reference, placed in any convenient class. There are rumors that the garbage collector goes around setting static references to null at a whim, but this is a misinterpretation of the actual behavior: That an entire app may get cleaned up at an unexpected time. When you return to your Activity, the system may actually have to restart the app and recreate the Activity from scratch. In this case, the reference would be reset to null.
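For illustration, such a holder might look like this (hypothetical class name):

import android.graphics.Bitmap;

public final class BitmapHolder {
    // Survives as long as the process does; lost only if the whole app is killed.
    public static Bitmap sharedBitmap;
}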
Instead, using a Service indicates to the OS that you have a component of your app that should be a little bit longer-lived. Certainly, it will continue to exist across the gap between two successive Activities.
Note that, on Oreo and later, the system can be quite aggressive about cleaning up apps as soon as they leave the foreground.
Just as the title says: is it possible to programmatically interact with or configure the native Camera app on an Android device?
For example, I am trying to have the native camera app take a picture without physically touching the capture button. I don't want to use the camera app's internal timer feature, since that requires physically touching the button as well.
So, simply put, I want to know whether it is possible to take a picture without physically touching the capture button.
Thanks in advance for the reply.
is it possible to programmatically interact/configure with native Camera app on Android device?
Not from an ordinary Android app.
Also, please note that there are several thousand Android device models. There are hundreds of different "native Camera app" implementations across those device models, as device manufacturers often implement their own. Your question implies that you think that there is a single "native Camera app", which is not the case.
For an individual device model, or perhaps a closely related family of devices, with a rooted device, you might be able to work something out with simulated user input. However, the myriad of camera apps means that you would need different rules for each app.
Also, if you are only concerned about your own device, you can use the uiautomator test system to control third-party apps, but that requires a connection to your development machine, as the tests are run from there.
It is possible. Just forget about "native Camera app" and use Camera/Camera2 API directly.
Some time ago I tried to make a background service that took a picture periodically, detected the face, and measured the eye distance, to prevent my little daughter from watching the tablet too closely; this failed because the tablet camera's angle was too narrow to capture her whole face.
I posted part of this app here (this code uses the deprecated Camera interface, which was replaced by the Camera2 interface in API 21):
public void onCreate() {
    super.onCreate();
    mContext = getApplicationContext();
    surfaceTexture = new SurfaceTexture(0);
}

public void takePicture() {
    Camera cam = openFrontCamera(mContext);
    if (cam != null) {
        try {
            cam.setPreviewTexture(surfaceTexture);
            cam.startPreview();
            cam.takePicture(null, null, mPicture);
        } catch (Exception ex) {
            Log.d(LOG_TAG, "Can't take picture!");
        }
    }
}

private static Camera.PictureCallback mPicture = new Camera.PictureCallback() {
    @Override
    public void onPictureTaken(byte[] data, Camera camera) {
        BitmapFactory.Options bfo = new BitmapFactory.Options();
        bfo.inPreferredConfig = Bitmap.Config.RGB_565;
        Bitmap bitmap = BitmapFactory.decodeStream(new ByteArrayInputStream(data), null, bfo);
        // Eye distance detection here and saving data
        camera.stopPreview();
        camera.release();
    }
};

/* Check if this device has a camera and open the front-facing one */
private static Camera openFrontCamera(Context context) {
    try {
        boolean hasCamera = context.getPackageManager().hasSystemFeature(PackageManager.FEATURE_CAMERA);
        if (hasCamera) {
            int cameraCount;
            Camera cam = null;
            Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
            cameraCount = Camera.getNumberOfCameras();
            for (int camIdx = 0; camIdx < cameraCount; camIdx++) {
                Camera.getCameraInfo(camIdx, cameraInfo);
                if (cameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
                    try {
                        cam = Camera.open(camIdx);
                    } catch (RuntimeException e) {
                        Log.e(LOG_TAG, "Camera failed to open: " + e.getLocalizedMessage());
                    }
                }
            }
            return cam;
        }
    } catch (Exception ex) {
        Log.d(LOG_TAG, "Can't open front camera");
    }
    return null;
}
Some additional info: to use this code, your app needs the camera permission in AndroidManifest.xml:
<uses-permission android:name="android.permission.CAMERA" />
Yes, you can capture an image without any user input, even in the background and without a preview frame. Take a look here. Hope this helps!
I'm making a music player app with simple functionality. But when I listen to music on my phone with Android 6, sometimes the music stops playing until I turn the display on again with the power button; then the next song plays, so it seems to be a problem with loading the next song. I wrote a new app just to test this, based on this tutorial:
https://code.tutsplus.com/tutorials/background-audio-in-android-with-mediasessioncompat--cms-27030
To this example I added an ArrayList of paths to songs. In the MediaPlayer's OnCompletionListener I increment the track counter and load the new song into the media player.
My code:
private void initMediaPlayer() {
    mMediaPlayer = new MediaPlayer();
    mMediaPlayer.setWakeMode(getApplicationContext(), PowerManager.PARTIAL_WAKE_LOCK);
    mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
    mMediaPlayer.setVolume(1.0f, 1.0f);
    mMediaPlayer.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
        @Override
        public void onCompletion(MediaPlayer mediaPlayer) {
            onTrackCompletion();
        }
    });
}

private void onTrackCompletion() {
    NextTrack();
    Play();
}

private void NextTrack() {
    playlistPosition++;
    if (playlistPosition == playlists.get(playlistCurrent).size) {
        playlistPosition = 0;
    }
    sendAction(ACTION_TRACK_NEXT);
    if (mMediaPlayer.isPlaying()) {
        Pause();
    }
    loadSong();
    Play();
}

private void loadSong() {
    String path = playlists.get(playlistCurrent).getPath(playlistPosition);
    if (path == null || path.isEmpty()) { // note: comparing with == "" would never match
        return;
    }
    try {
        try {
            mMediaPlayer.setDataSource(path);
        } catch (IllegalStateException e) {
            mMediaPlayer.release();
            initMediaPlayer();
            mMediaPlayer.setDataSource(path);
        }
        initMediaSessionMetadata();
    } catch (IOException e) {
        return;
    }
    try {
        mMediaPlayer.prepare();
    } catch (IOException e) {}
    sendTrackData();
}
I'm out of ideas as to why this doesn't work. In the manifest I have the WAKE_LOCK permission, and I also set the wake mode on the MediaPlayer.
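For reference, the MediaPlayer state machine requires a reset() before setDataSource() can be called again on an instance that has already been used. A sketch of that pattern (a simpler alternative to releasing and recreating the player, not necessarily the cause of the sleep issue; the method name is illustrative):

private void loadSongWithReset(String path) {
    mMediaPlayer.reset(); // back to the idle state, so setDataSource() is legal again
    try {
        mMediaPlayer.setDataSource(path);
        mMediaPlayer.prepare();
    } catch (IOException e) {
        Log.e("Player", "could not load " + path, e);
    }
}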
Edit:
Today I tried moving the song loading into onPlayFromMediaId. I broadcast from AutoActivity (where the MediaPlayer lives) to MainActivity and send back onPlayFromMediaId with the path to the song, but it seems this doesn't work either. I also found out that changing the volume with the buttons also wakes the app up.
Edit2:
I ran many tests and added debug output in many places in the code, and I found that the app stops at mediaPlayer.prepare() until I trigger some action on the phone (turning on the display, volume up/down, clicking the headset button). But I don't know how to fix this bug. I tried using prepareAsync(), but it didn't help.
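One thing worth checking (a hedged suggestion, not a confirmed fix): setWakeMode() only holds the wake lock while the player is actually playing, so the CPU may be allowed to sleep in the gap between tracks. A sketch that holds an explicit partial wake lock across the track transition (method and tag names are illustrative):

import android.content.Context;
import android.os.PowerManager;

private PowerManager.WakeLock transitionLock;

private void nextTrackWithWakeLock() {
    if (transitionLock == null) {
        PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
        transitionLock = pm.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, "myplayer:transition");
    }
    transitionLock.acquire(10_000); // auto-release safety timeout of 10 s
    try {
        NextTrack(); // loads and prepares the next song
    } finally {
        if (transitionLock.isHeld()) {
            transitionLock.release();
        }
    }
}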
Unless you use a foreground service, the system will kill your process and the MediaPlayer will stop.
Below is a part of a foreground service (notification example).
// NotificationCompat.Builder comes from the support/AndroidX library;
// aMessage and pendingIntent are defined elsewhere in the service.
NotificationCompat.Builder builder = new NotificationCompat.Builder(this);
builder.setContentTitle(aMessage) // required
        .setSmallIcon(R.mipmap.ic_launcher)
        .setContentText(this.getString(R.string.app_name)) // required
        .setAutoCancel(false)
        .setContentIntent(pendingIntent)
        .setVibrate(new long[]{0L})
        .setPriority(Notification.PRIORITY_HIGH);
int mId = 1489;
startForeground(mId, builder.build());
The above code is tested and working fine.
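On Android 8.0+ the notification also needs a channel, or it will not be shown. A minimal sketch (channel id and name are illustrative):

import android.app.NotificationChannel;
import android.app.NotificationManager;
import android.os.Build;

if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
    NotificationChannel channel = new NotificationChannel(
            "playback", "Playback", NotificationManager.IMPORTANCE_LOW);
    NotificationManager manager = getSystemService(NotificationManager.class);
    manager.createNotificationChannel(channel);
    // Pass the channel id when constructing the builder:
    // new NotificationCompat.Builder(this, "playback")
}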