I am using the CameraView API ('com.otaliastudios:cameraview:2.6.4') for a live camera preview, and I want to turn on the flashlight. According to the documentation I can do this through CameraView, but it does not work.
I also tried to do it with CameraManager, but that is not working either: it throws an exception saying the flashlight cannot be used while the camera is in use. I found some similar questions on Stack Overflow; the answers seem to work, but the code is incomplete and I cannot make sense of it.
Do you have any idea how I can do this? I have already declared the FLASHLIGHT and CAMERA permissions in my AndroidManifest file.
Thank you in advance.
I was facing the same issue. Try the solution below, or read this blog post: CameraX: Learn how to use CameraController
val cameraInstance: Camera? = cameraProvider?.bindToLifecycle(viewLifecycleOwner, cameraSelector, previewUseCase)

private fun flashToggle() {
    // Toggle the torch and update the button icon
    val cameraControl = cameraInstance?.cameraControl
    if (cameraInstance?.cameraInfo?.torchState?.value == TorchState.ON) {
        cameraControl?.enableTorch(false)
        binding.barscanfFlashToggle.setImageResource(R.drawable.ic_baseline_flash_off_24)
    } else {
        // Note: the original code passed false here too, so the torch could never turn on
        cameraControl?.enableTorch(true)
        binding.barscanfFlashToggle.setImageResource(R.drawable.ic_baseline_flash_on_24)
    }
}
I am using the device's built-in camera to capture video in my application, but it fails on some Android phones, such as OnePlus devices; on other phones it works perfectly. Can anyone help me solve this? I have tried many times without success. I am sharing my code below.
To be specific, the issue only shows up on devices like the OnePlus (Android 11). After the user taps the button to record a video, the app should redirect to the device's built-in camera. On other devices this works perfectly, but on the OnePlus it never redirects and just stays stuck on the same page.
private void captureVideo() {
    Intent videoIntent = new Intent(MediaStore.ACTION_VIDEO_CAPTURE);
    if (videoIntent.resolveActivity(getPackageManager()) != null) {
        videoIntent.putExtra(MediaStore.EXTRA_VIDEO_QUALITY, 1);
        videoIntent.putExtra("android.intent.extras.CAMERA_FACING", 1);
        videoIntent.putExtra(MediaStore.EXTRA_DURATION_LIMIT, 3);
        videoIntent.putExtra(MediaStore.EXTRA_FINISH_ON_COMPLETION, true);
        startActivityForResult(videoIntent, VIDEO_REQUEST);
    }
}
I am working on an app that requires me to always set the exposure from the centre of the preview image. I am working with CameraX on Android and was wondering if there is a clean way of doing this?
P.S.: Java snippets are appreciated.
Thanks
Edit:
I have implemented a function for this, but because it is not triggered by a tap, it is hard to tell whether my function is actually working or whether I am just seeing the normal autofocus and auto-exposure.
The following is the code I have implemented; due to project restrictions I am bound to the CameraX "1.0.0-alpha06" version:
private void setUpTapToFocus(CameraControl cameraControl) {
    Log.v("Metering Point", "inFunction");
    // DisplayOrientedMeteringPointFactory isn't available in this CameraX version
    MeteringPointFactory factory = new SensorOrientedMeteringPointFactory(viewFinder.getWidth(), viewFinder.getHeight());
    int centerWidth = viewFinder.getWidth() / 2;
    int centerHeight = viewFinder.getHeight() / 2;
    MeteringPoint point = factory.createPoint(centerWidth, centerHeight);
    cameraControl.startFocusAndMetering(
            FocusMeteringAction.Builder.from(point)
                    .setAutoFocusCallback(isSuccess -> Log.v("SKAXC Point", "Focused"))
                    .setAutoCancelDuration(1, TimeUnit.SECONDS)
                    .build());
}

onCreate() {
    CameraControl cameraControl = CameraX.getCameraControl(CameraX.LensFacing.BACK);
    setUpTapToFocus(cameraControl);
}
I'm trying to develop a simple camera app with face detection, using the android-vision sample from here:
https://github.com/googlesamples/android-vision/tree/master/visionSamples/FaceTracker
Everything is working fine, but I need to add a zoom in/out feature. I searched SO but found nothing related to vision; every answer is about Camera2.
You might try startSmoothZoom:
https://developer.android.com/reference/android/hardware/Camera.html#startSmoothZoom(int)
You'd need to modify the open source version of CameraSource to make this change, since you'd need access to its underlying android.hardware.Camera instance:
https://github.com/googlesamples/android-vision/blob/master/visionSamples/barcode-reader/app/src/main/java/com/google/android/gms/samples/vision/barcodereader/ui/camera/CameraSource.java#L121
Try this code; it works (yes, it uses reflection):
try {
    cameraSource.apply {
        start(holder)
        javaClass.getDeclaredField("zzg").apply {
            isAccessible = true
            (get(cameraSource) as Camera).apply {
                startSmoothZoom(min(5, parameters.maxZoom))
            }
        }
    }
} catch (e: Throwable) {
    Timber.e(e)
}
Note that zzg is the obfuscated field holding the Camera instance, and its name may differ between library releases.
I am trying to record video through Kivy (http://kivy.org/#home) and am not sure what direction or libraries to use.
Currently I have the camera widget working with the code below, which gets the camera to display on the screen, but I am not sure how to get it to record and save the video file. Any help is greatly appreciated!
class MyApp(App):
    # Callback to take a screenshot of the window
    def doscreenshot(self, *largs):
        Window.screenshot(name='screenshot%(counter)04d.jpg')

    def build(self):
        camwidget = Widget()  # Root widget that holds the camera
        cam = Camera(resolution=(640, 480), size=(500, 500))
        cam.play = True  # Start the camera
        camwidget.add_widget(cam)

        button = Button(text='screenshot', size_hint=(0.12, 0.12))
        button.bind(on_press=self.doscreenshot)
        camwidget.add_widget(button)  # Add the button on top of the camera widget
        return camwidget

if __name__ == '__main__':
    MyApp().run()
Kivy only supports playing back the video / camera widget; there is nothing in the framework for encoding video and saving it to a file.
Try using GStreamer directly instead; you may have more luck.
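If GStreamer is installed, a command-line pipeline is a quick way to check that recording works before wiring it into Python. This is only a sketch: the element names here (v4l2src, x264enc) are assumptions that depend on your platform and on which plugins you have installed.

```
# Record the default V4L2 camera to an MP4 file; stop with Ctrl+C.
# -e sends EOS on interrupt so mp4mux can finalize the file properly.
gst-launch-1.0 -e v4l2src ! videoconvert ! x264enc ! mp4mux ! filesink location=capture.mp4
```

The same pipeline description can later be passed to Gst.parse_launch() from Python.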
I have an Android app, ported from a successful iOS app that was released and works fine. On one device it works correctly, with no errors and no crashes. However, on another device (newer, with a newer version of Android and a faster processor) it crashes. Here are the memory errors when it crashes.
The app is designed to take a picture, use that picture as the background of a canvas, and then take a screenshot. I can go through this process once, but if I repeat it and take another picture, the application crashes on exiting the camera API the second time, and displays these errors. The application is developed in Eclipse using PhoneGap and jQuery Mobile. I am unsure which parts of my code to post; please feel free to ask if you think something may be relevant.
Any help is really appreciated.
Okay, review my post on why we run out of memory in PhoneGap Android apps:
http://simonmacdonald.blogspot.ca/2012/07/change-to-camera-code-in-phonegap-190.html
It has some tips on using the camera.getPicture() command. Other than that, I'm currently working on fixing some of these camera issues by bringing everything in-house instead of firing a camera intent.
You can use the following trick. To set the picture as the background, use:
public void setbg() {
    // Note: getExternalStorageDirectory(), not getExternalStorageState(),
    // and the path needs a separator before the file name
    String pathName = Environment.getExternalStorageDirectory() + "/scm_pic.jpg";
    Resources res = getResources();
    Bitmap bitmap = BitmapFactory.decodeFile(pathName);
    BitmapDrawable bd = new BitmapDrawable(res, bitmap);
    View view = findViewById(R.id.container);
    view.setBackgroundDrawable(bd);
}
And then, when you no longer need the picture (for example in an overridden onPause(), or when switching to another activity), use:
private void unbindDrawables(View view) {
    if (view.getBackground() != null) {
        view.getBackground().setCallback(null);
    }
    if (view instanceof ViewGroup) {
        for (int i = 0; i < ((ViewGroup) view).getChildCount(); i++) {
            unbindDrawables(((ViewGroup) view).getChildAt(i));
        }
        ((ViewGroup) view).removeAllViews();
    }
}
For example, in the overridden method use:
unbindDrawables(findViewById(R.id.container));
System.gc();
If you use the camera, the destination type must be FILE_URI on Android, because DATA_URL returns a base64 string, which sometimes causes an out-of-memory error. First get the file URI from the camera, then convert it to base64 using the FileSystem APIs only if you actually need it.
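To illustrate why DATA_URL is risky (a minimal sketch in plain Python, not PhoneGap code): base64 inflates the image bytes by about a third, and the whole encoded string has to be held in the WebView's memory at once, on top of the decoded bitmap.

```python
import base64

# Hypothetical 3 MB camera image (all zero bytes, just to measure size)
raw = bytes(3 * 1024 * 1024)
encoded = base64.b64encode(raw)

ratio = len(encoded) / len(raw)
print(round(ratio, 2))  # base64 output is ~1.33x the raw size
```

A FILE_URI, by contrast, is just a short path string; the image stays on disk until something actually decodes it.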