I have been trying to use the ML Kit Vision Quickstart Sample App to develop a blink detection app.
For this, I have been using the Face Detector module with the CameraX library.
After installing the sample app on my Android phone, I found that rotating the device while Screen Rotation is locked makes the app unable to detect faces, which is expected.
I want to know how I should modify the code to override the Screen Rotation lock on the device, so that the app automatically rotates its orientation to detect faces.
To override the rotation lock you can add the following attribute to your target activity in the manifest:
android:screenOrientation="sensor"
However, this only allows the activity to rotate regardless of the rotation lock; you still need to handle orientation changes in your camera code. What I mean is that the camera sensor is fixed to the device and does not rotate, so you have to handle the image rotation yourself in code.
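The rotation to apply to each frame before handing it to the detector can be derived from the camera's sensor orientation and the current device rotation. A minimal pure-Java sketch of that arithmetic, using the relationship documented for Camera2 (the class and method names here are illustrative, not part of ML Kit or CameraX):

```java
// Computes the rotation (in degrees) to apply to a camera frame so it is
// upright for analysis, given the fixed sensor mounting angle and the
// current device rotation (0, 90, 180, or 270).
class RotationHelper {
    static int computeFrameRotation(int sensorOrientation,
                                    int deviceRotationDegrees,
                                    boolean frontFacing) {
        if (frontFacing) {
            // Front camera: rotations add (the image is also mirrored in practice).
            return (sensorOrientation + deviceRotationDegrees) % 360;
        }
        // Back camera: compensate for device rotation.
        return (sensorOrientation - deviceRotationDegrees + 360) % 360;
    }

    public static void main(String[] args) {
        // Typical back sensor mounted at 90 degrees, device in natural portrait:
        System.out.println(computeFrameRotation(90, 0, false)); // 90
    }
}
```

The resulting value is what you would pass as the rotation when building the detector's input image; with CameraX's `ImageAnalysis`, the per-frame rotation is also available from the framework itself.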
I made an app that uses the onKeyDown/onKeyUp callbacks to capture a Bluetooth camera shutter button, which sends volume-up and volume-down key events. I am trying to use it in a camera app to take pictures while the screen is off.
However, this no longer works when the screen is locked or off. The minute the screen is back on, the callbacks work again, so I know the Bluetooth connection is still alive.
Can anyone with experience in this tell me whether it's possible for the phone to capture these Bluetooth key events when the screen is off? What changes should I make?
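For reference, the key-matching logic described above boils down to checking for the two volume key codes. A pure-Java sketch (the keycode values 24/25 are copied from `android.view.KeyEvent`; the class name and method shape here are illustrative, not a framework API — note that an Activity's key callbacks stop firing once the screen is off, since the activity loses focus):

```java
// Decides whether an incoming key code should trigger the camera shutter.
class ShutterKeyFilter {
    static final int KEYCODE_VOLUME_UP = 24;   // android.view.KeyEvent.KEYCODE_VOLUME_UP
    static final int KEYCODE_VOLUME_DOWN = 25; // android.view.KeyEvent.KEYCODE_VOLUME_DOWN

    static boolean isShutterKey(int keyCode) {
        return keyCode == KEYCODE_VOLUME_UP || keyCode == KEYCODE_VOLUME_DOWN;
    }
}
```

With the screen off, one commonly suggested direction is to move this handling out of the activity, for example into a foreground service holding a media session that receives button events, though whether that works for a given Bluetooth remote depends on how it reports its keys.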
I want to add a mirror effect to the live camera preview screen. I'm using the Open Camera library. Can anyone help me achieve this? I've been stuck for the last couple of days.
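Conceptually, a mirror effect is a horizontal flip of each row of the preview. In practice you would apply a (-1, 1) scale to the preview transform (e.g. via a matrix or in the shader), but the underlying pixel math can be sketched in plain Java (the class name is illustrative):

```java
// Mirrors one row of pixels horizontally: the pixel at x moves to (width-1-x).
class MirrorEffect {
    static int[] mirrorRow(int[] row) {
        int[] out = new int[row.length];
        for (int x = 0; x < row.length; x++) {
            out[x] = row[row.length - 1 - x];
        }
        return out;
    }
}
```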
I am wondering if it is possible to show the output of the camera running on your device immediately on the screen. Just like the native camera app does.
I have to show the picture that comes into the camera lens, and additionally add some graphics overlays on top. That's why I guess starting an Intent to open the camera activity is not suitable.
I've found some SO threads, tutorials, and documentation on the Android Camera API, but they all just take a picture and display it afterwards.
Is it possible at all?
Refer to this link: Camera Tutorial for Android (using surfaceview)!
Use a SurfaceView to preview the camera output; you can then add your graphics overlays as you wish.
Hope this helps :)
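One detail when drawing overlays on top of a preview: the camera frame and the on-screen view usually have different sizes, so overlay coordinates computed in camera space must be mapped into view space. A minimal sketch assuming simple independent scaling per axis (all names here are illustrative):

```java
// Maps a point from camera-frame coordinates to view coordinates by scaling
// each axis independently (ignores letterboxing/aspect-ratio cropping).
class OverlayMapper {
    static float mapX(float cameraX, int cameraWidth, int viewWidth) {
        return cameraX * viewWidth / (float) cameraWidth;
    }

    static float mapY(float cameraY, int cameraHeight, int viewHeight) {
        return cameraY * viewHeight / (float) cameraHeight;
    }
}
```

If the preview is cropped or letterboxed to preserve aspect ratio, the mapping also needs an offset; the uniform-scale version above is the starting point.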
I am currently working on an Android application and I need to be able to:
Find squares in the camera preview; that part is working.
The problem I have is that, to find these squares in the camera preview smoothly, I need to set my camera resolution to 800x600.
What I want is: as soon as I hit my button, I want to capture the image at a specific resolution (1920x1080, for example). So basically the preview has to be 800x600 but the capture has to be 1920x1080.
I am using OpenCV to process my camera view live.
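The camera APIs generally let the preview stream and the still-capture stream use different sizes, as long as each is picked from the device's supported size list. A pure-Java sketch of the usual "pick the closest supported size" step (the size list would come from something like `Camera.Parameters.getSupportedPictureSizes()` or Camera2's `StreamConfigurationMap`; the `int[][]` shape and class name here are illustrative):

```java
// Picks the supported size whose pixel area is closest to the requested target,
// so the preview can run at 800x600 while capture requests 1920x1080.
class SizeChooser {
    static int[] closest(int[][] supported, int targetW, int targetH) {
        int[] best = supported[0];
        long bestDiff = Long.MAX_VALUE;
        long targetArea = (long) targetW * targetH;
        for (int[] s : supported) {
            long diff = Math.abs((long) s[0] * s[1] - targetArea);
            if (diff < bestDiff) {
                bestDiff = diff;
                best = s;
            }
        }
        return best;
    }
}
```

You would run this once for the preview target (800x600) and once for the capture target (1920x1080), then configure the two streams separately.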
I have properly set up the Camera preview using SurfaceTexture and OpenGL.
After days of research and trial and error, I'm still unable to blur the pixels; there seems to be very little documentation on this.
The desired effect is blurring and dimming the camera while opening DrawerLayout.
So far all the apps and guides that I've found blur a bitmap.
Can someone help me out or point me in the right direction?
You will have to use a fragment shader and apply your blurring filter there. For your use case, keep a flag (a uniform) inside the shader that determines whether the blur is applied or not. When your drawer is about to open or starts to close, you can set and unset that flag.
A simple Google search will point you to many further readings on this topic.
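The per-fragment work such a shader does is just a neighborhood average gated by the flag. A pure-Java sketch of a 3x3 box blur at one pixel (in GLSL the flag would be a uniform; the name `enabled` and the class here are illustrative, not from the question):

```java
// Computes the 3x3 box-blur average at (x, y); when the flag is off,
// the pixel passes through unchanged, matching the shader's branch.
class BoxBlur {
    static float blurAt(float[][] img, int x, int y, boolean enabled) {
        if (!enabled) return img[y][x]; // flag unset: no blur
        float sum = 0f;
        for (int dy = -1; dy <= 1; dy++) {
            for (int dx = -1; dx <= 1; dx++) {
                // Clamp at the edges, as GL_CLAMP_TO_EDGE sampling would.
                int cy = Math.max(0, Math.min(img.length - 1, y + dy));
                int cx = Math.max(0, Math.min(img[0].length - 1, x + dx));
                sum += img[cy][cx];
            }
        }
        return sum / 9f;
    }
}
```

For the dimming part of the effect, the same shader can multiply the final color by a brightness factor driven by the drawer's slide offset.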