Force close for Android Custom Camera app - java

I am developing a custom Android camera application without using the camera intent (to avoid getting Android's built-in camera features). I have enabled the auto-focus feature in my app, and I take the picture when a volume key is pressed. I am using the code below to set the parameters.
Camera.Parameters p = camera.getParameters();
camera.autoFocus(autoFocusCallback);
p.setFocusMode(Parameters.FOCUS_MODE_AUTO);
camera.setParameters(p1);
camera.takePicture(shutterCallback, rawCallback, jpgCallback);
void setHandler(Handler autoFocusHandler, int autoFocusMessage)
{
this.autoFocusHandler = autoFocusHandler;
this.autoFocusMessage = autoFocusMessage;
}
private AutoFocusCallback autoFocusCallback = new AutoFocusCallback()
{
private Object success;
@Override
public void onAutoFocus(boolean autoFocusSuccess, Camera camera)
{
if (autoFocusHandler != null)
{
Message message = autoFocusHandler.obtainMessage(autoFocusMessage, success);
autoFocusHandler.sendMessageDelayed(message, AUTOFOCUS_INTERVAL_MS);
autoFocusHandler = null;
}
else
{
}
}
};
But the problem is that this code works fine only on an LG phone; on all other phones I get a force close after running it.
The error log looks like this:
http://textuploader.com/?p=6&id=kOc9G
I can't see where I am going wrong. Please help! Thanks!

Different phones support different camera parameters, so check whether a mode is available before actually setting it.
For example, in your case there is the public List<String> getSupportedFocusModes() method of the
Camera.Parameters class.
As far as I know, cheap phones from Acer, ZTE, and some other brands have very weak programming support for their cameras.
Update: code sample
Camera.Parameters p = camera.getParameters();
List<String> modes = p.getSupportedFocusModes();
if(modes.contains(Camera.Parameters.FOCUS_MODE_AUTO))
{
p.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
camera.setParameters(p);
camera.autoFocus(autoFocusCallback);
}
else
{
// this is default focus mode if autofocus unsupported.
// also, we should not call camera.autoFocus(autoFocusCallback) here
p.setFocusMode(Camera.Parameters.FOCUS_MODE_FIXED);
camera.setParameters(p);
}

You are using
Camera.Parameters p = camera.getParameters();
so replace
camera.setParameters(p1);
with
camera.setParameters(p);
I think this should help you:
Camera.Parameters p = camera.getParameters();
List<Size> sizes = p.getSupportedPictureSizes();
// Choose any one you want among sizes
Camera.Size size = sizes.get(0);
p.setPictureSize(size.width, size.height);
camera.setParameters(p);

Don't use the p.setFocusMode(Parameters.FOCUS_MODE_AUTO); line.
By default, the focus mode will already be FOCUS_MODE_AUTO.
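If you want to be defensive about it, here is a small sketch (reusing the camera object and callbacks from the question) that only calls autoFocus() when the current focus mode actually supports it; this is an illustration, not the asker's original flow:
// autoFocus() is only valid in AUTO or MACRO focus mode
Camera.Parameters p = camera.getParameters();
String mode = p.getFocusMode();
if (Camera.Parameters.FOCUS_MODE_AUTO.equals(mode)
        || Camera.Parameters.FOCUS_MODE_MACRO.equals(mode)) {
    camera.autoFocus(autoFocusCallback);
} else {
    // no autofocus support in this mode, just take the picture
    camera.takePicture(shutterCallback, rawCallback, jpgCallback);
}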

Related

How to Freeze/Lock android Camera X preview with Flash Light updates on taking picture?

I am implementing CameraX. The issue I am facing is implementing a mechanism to lock/freeze the camera preview when a picture is captured. Currently I have implemented a workaround, but it doesn't work well if the flash is on while capturing: I grab a frame from previewView (a PreviewView) via previewView.getBitmap() just before capturing the image and then display it in captureImage (an ImageView), but the frozen frame does not show the flash firing. My current code is below.
private void capturePhoto() {
showProgress(true);
// Get the Information to be used & stored with Image
ContentValues contentValues = getImageSaveInfo();
Uri externalUri = MediaStore.Images.Media.EXTERNAL_CONTENT_URI;
ImageCapture.OutputFileOptions options = new ImageCapture.OutputFileOptions
.Builder(getContentResolver(), externalUri, contentValues)
.build();
// Play the Capture Sound when a picture is captured.
playCameraShutterSound();
// Display current frame From Preview in ImageView.
freezePreview(true);
imageCapture.takePicture(options,
ContextCompat.getMainExecutor(this),
new ImageCapture.OnImageSavedCallback() {
@Override
public void onImageSaved(@NonNull ImageCapture.OutputFileResults results) {
ToastUtility.successToast(getApplicationContext(),
"Photo Capture Successfully");
// Update Last Taken Image View with new Image
getLastTakenImage();
if (results.getSavedUri() != null) {
Log.d(TAG, "Image Saved At -> " + results.getSavedUri().toString());
}
showProgress(false);
freezePreview(false);
}
@Override
public void onError(@NonNull ImageCaptureException exception) {
ToastUtility.errorToast(getApplicationContext(),
"Photo Couldn't Capture");
Log.d(TAG, "Image Capture Error -> " + exception.getMessage());
showProgress(false);
freezePreview(false);
}
});
}
private void freezePreview(boolean value) {
if (value) {
Bitmap bitmap = mainBinding.previewView.getBitmap();
Glide.with(getApplicationContext())
.load(bitmap).into(mainBinding.captureImage);
mainBinding.captureImage.setVisibility(View.VISIBLE);
mainBinding.previewView.setVisibility(View.INVISIBLE);
} else {
mainBinding.previewView.setVisibility(View.VISIBLE);
mainBinding.captureImage.setVisibility(View.INVISIBLE);
}
}
The flash is triggered at some point after takePicture() is called; there isn't a callback for it in CameraX, so there is no direct way to know when it has fired.
You can instead use the camera2 interop to indirectly check the flash state. You can add a session CaptureCallback to ImageCapture's config, then inside the callback's onCaptureCompleted, check whether the flash state of the total result is FIRED (a sketch of such a callback follows the snippet below).
// Override onCaptureCompleted to check for the flash state
CameraCaptureSession.CaptureCallback sessionCaptureCallback = //... ;
// Initialize an ImageCapture builder
ImageCapture.Builder configBuilder = new ImageCapture.Builder();
// Add the session CaptureCallback to it
new Camera2Interop.Extender<>(configBuilder)
.setSessionCaptureCallback(sessionCaptureCallback);
// Build the ImageCapture use case
ImageCapture useCase = configBuilder.build();
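For illustration, a hedged sketch of what that sessionCaptureCallback might look like; the body of the if block is a placeholder for your own freeze-frame logic:
// Check the flash state of every completed capture
CameraCaptureSession.CaptureCallback sessionCaptureCallback =
        new CameraCaptureSession.CaptureCallback() {
    @Override
    public void onCaptureCompleted(@NonNull CameraCaptureSession session,
                                   @NonNull CaptureRequest request,
                                   @NonNull TotalCaptureResult result) {
        Integer flashState = result.get(CaptureResult.FLASH_STATE);
        if (flashState != null && flashState == CaptureResult.FLASH_STATE_FIRED) {
            // The flash fired for this frame, so a freeze frame grabbed now
            // will actually show the flash-lit scene.
        }
    }
};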

APP based on camera2 API crash in certain devices due to an Image Reader surface

I am having one of those mysterious Android issues (at least from my point of view). My app manages the device's camera via the Camera2 API. In my case I have two surfaces, one of them coming from an ImageReader object. Next I define my capture session and set those surfaces as targets. As you can see in the code below, I am following the typical workflow for such cases:
// Create ImageReader Surface
int max = 2;
mReader = ImageReader.newInstance(mWidth, mHeight, ImageFormat.YV12, max);
ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader mReader) {
Image image = null;
image = mReader.acquireLatestImage();
if (image == null) {
return;
}
byte[] bytes = convertYUV420ToNV21(image);
nativeVideoFrame(bytes);
image.close();
}
};
if (OPENGL_SOURCE==2){
nativeVideoInit(mWidth, mHeight, 0, false);
}
mReader.setOnImageAvailableListener(readerListener, mBackgroundHandler);
// Create Texture Surface
texture = createTexture();
mSurfaceTexture = new SurfaceTexture(texture);
mSurfaceTexture.setOnFrameAvailableListener(this);
mSurfaceTexture.setDefaultBufferSize(imageDimension.getWidth(), imageDimension.getHeight());
mSurface = new Surface(mSurfaceTexture);
//Attach surfaces to CaptureRequest
List<Surface> outputSurfaces = new ArrayList<Surface>(2);
outputSurfaces.add(mReader.getSurface());
outputSurfaces.add(mSurface);
captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
captureRequestBuilder.addTarget(mSurface);
captureRequestBuilder.addTarget(mReader.getSurface());
//Define the capture request
cameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback(){
@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
//The camera is already closed
if (null == cameraDevice) {
return;
}
// When the session is ready, we start displaying the preview.
cameraCaptureSessions = cameraCaptureSession;
updatePreview();
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
Toast.makeText(MainActivity.this, "Configuration change", Toast.LENGTH_SHORT).show();
}
}, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
The thing is that I have no problem running this code on my Samsung Tab A tablet. However, when trying it on my Nexus 5X or my friend's Samsung S6, the app crashes dramatically, throwing this error:
08-23 11:28:51.772: E/AndroidRuntime(20315): FATAL EXCEPTION: main
08-23 11:28:51.772: E/AndroidRuntime(20315): Process: com.example.opengltest, PID: 20315
08-23 11:28:51.772: E/AndroidRuntime(20315): java.lang.IllegalArgumentException: Bad argument passed to camera service
08-23 11:28:51.772: E/AndroidRuntime(20315): at android.hardware.camera2.utils.CameraBinderDecorator.throwOnError(CameraBinderDecorator.java:114)
Doing some tests, I found that the problem comes from the ImageReader surface. If I remove this surface from the capture session's settings, the code runs seamlessly.
Why is this happening only on my Nexus 5X and the Samsung S6 and not on my tablet? And how can I fix it?
Thanks,
JM
If you look at the whole system logcat, the camera service should have a
more detailed line that states why your output surface set is bad.
However, YV12 is not a format that's guaranteed to be supported, so it's probably just that. Some devices support it, some don't.
The only YUV format that's guaranteed to be supported by all devices is ImageFormat.YUV_420_888.
If you want to use YV12 when possible, you'll need to check the StreamConfigurationMap.getOutputFormats() list to see if it's listed before trying to use it. But you'll still need to fall back to YUV_420_888 on many devices, so it's simplest to just support that directly and nothing else.
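A rough sketch of that check, assuming a CameraManager and the cameraId you are about to open are already available (mWidth, mHeight, and max come from the question's code):
// Prefer YV12 only if the device actually advertises it; otherwise fall
// back to YUV_420_888, which every device must support.
int format = ImageFormat.YUV_420_888;
try {
    CameraCharacteristics characteristics =
            cameraManager.getCameraCharacteristics(cameraId);
    StreamConfigurationMap map =
            characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
    for (int f : map.getOutputFormats()) {
        if (f == ImageFormat.YV12) {
            format = ImageFormat.YV12;
            break;
        }
    }
} catch (CameraAccessException e) {
    e.printStackTrace();
}
mReader = ImageReader.newInstance(mWidth, mHeight, format, max);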

Is it possible to programmatically take photo with native Camera app on Android device without physically tapping the capture button?

Just as the title says, is it possible to programmatically interact with or configure the native Camera app on an Android device?
For example, I am trying to have the native camera app take a picture without physically touching the capture button. I don't want to use the camera app's internal timer feature either, since that also requires a physical touch on the button.
So, simply put, I want to know if it is possible to take a picture without physically touching the capture button.
Thanks in advance for the reply.
is it possible to programmatically interact/configure with native Camera app on Android device?
Not from an ordinary Android app.
Also, please note that there are several thousand Android device models. There are hundreds of different "native Camera app" implementations across those device models, as device manufacturers often implement their own. Your question implies that you think that there is a single "native Camera app", which is not the case.
For an individual device model, or perhaps a closely related family of devices, with a rooted device, you might be able to work something out with simulated user input. However, the myriad of camera apps means that you would need different rules for each app.
Also, if you are only concerned about your own device, you can use the uiautomator test system to control third-party apps, but that requires a connection to your development machine, as the tests are run from there.
It is possible. Just forget about the "native Camera app" and use the Camera/Camera2 API directly.
Some time ago I tried to make a background service that periodically takes a picture, detects the face, and measures the eye distance, to keep my little daughter from watching the tablet too closely; this failed because the tablet's camera angle was too narrow to capture her whole face.
I posted part of this app here (this code uses the deprecated Camera interface, which was replaced by the Camera2 interface in API 21):
public void onCreate() {
super.onCreate();
mContext = getApplicationContext();
surfaceTexture = new SurfaceTexture(0);
}
public void takePicture() {
Camera cam = openFrontCamera(mContext);
if (cam != null) {
try {
cam.setPreviewTexture(surfaceTexture);
cam.startPreview();
cam.takePicture(null, null, mPicture);
} catch (Exception ex) {
Log.d(LOG_TAG, "Can't take picture!");
}
}
}
private static Camera.PictureCallback mPicture = new Camera.PictureCallback() {
@Override
public void onPictureTaken(byte[] data, Camera camera) {
BitmapFactory.Options bfo = new BitmapFactory.Options();
bfo.inPreferredConfig = Bitmap.Config.RGB_565;
Bitmap bitmap = BitmapFactory.decodeStream(new ByteArrayInputStream(data), null, bfo);
// Eye distance detection here and saving data
camera.stopPreview();
camera.release();
}
};
/* Check if this device has a camera */
private static Camera openFrontCamera(Context context) {
try {
boolean hasCamera = context.getPackageManager().hasSystemFeature(PackageManager.FEATURE_CAMERA);
if (hasCamera) {
int cameraCount = 0;
Camera cam = null;
Camera.CameraInfo cameraInfo = new Camera.CameraInfo();
cameraCount = Camera.getNumberOfCameras();
for (int camIdx = 0; camIdx < cameraCount; camIdx++) {
Camera.getCameraInfo(camIdx, cameraInfo);
if (cameraInfo.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
try {
cam = Camera.open(camIdx);
} catch (RuntimeException e) {
Log.e(LOG_TAG, "Camera failed to open: " + e.getLocalizedMessage());
}
}
}
return cam;
}
} catch (Exception ex) {
Log.d(LOG_TAG, "Can't open front camera");
}
return null;
}
Some additional info: to use this code, your app should have the camera permission in AndroidManifest.xml:
<uses-permission android:name="android.permission.CAMERA" />
Yes, you can capture an image without any user input, even in the background without a preview frame. Take a look here. Hope this helps you!

onFaceDetection called only once or twice while running but works perfectly when debugging with breakpoints

This is the code I am using for face detection. The problem is that when I debug this code with Android Studio, the onFaceDetection method is called multiple times and the face is detected perfectly (when I put a breakpoint inside the method). But when I run it without any breakpoints, the method is called only 2-3 times and face detection doesn't take place. Any help regarding this would be much appreciated; as you can see from the code, I've already tried stopping and restarting face detection.
void setFaceDetectionListener() {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.ICE_CREAM_SANDWICH) {
mFaceDetectionListener = new Camera.FaceDetectionListener() {
Handler faceDetectionHandler;
@Override
public void onFaceDetection(final Camera.Face[] faces, final Camera camera) {
if(faceDetectionHandler == null){//Initialize
faceDetectionHandler = new Handler();
Toast.makeText(HWTestActivity.this,
UiMessages.MSG_SHOW_YOUR_FACE.toString(),
Toast.LENGTH_SHORT).show();
}
faceDetectionHandler.post(new Runnable() {
@Override
public void run() {
Log.e("faceDetect", "No of faces = " + faces.length);
if (!is_face_detected) {
Toast.makeText(HWTestActivity.this,
UiMessages.MSG_DETECTING_YOUR_FACE.toString(),
Toast.LENGTH_SHORT).show();
is_face_detected = faces.length > 0;
}
if (faces.length > 0) {
Toast.makeText(HWTestActivity.this,
UiMessages.MSG_FACE_DETECTED.toString(),
Toast.LENGTH_SHORT).show();
camera.stopFaceDetection();
} else {
camera.stopFaceDetection();
camera.startFaceDetection();
}
}
});
}
};
}
}
This was ignorance on my part: apparently you can't have face detection running while the media recorder is running, so don't try to run face detection while simultaneously recording with the camera.
If you really want to detect faces while recording, you should use the
onPreviewFrame(byte[] pixelData, Camera camera)
method of
Camera.PreviewCallback
to convert the pixelData to an RGB_565 bitmap and supply it to the FaceDetector.findFaces method, as sketched below. But in my experience this approach is very unreliable.
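For reference, this is roughly what that looks like, as a sketch only: it assumes the default NV21 preview format, and the previewCallback name is mine, not from the original answer.
// Convert each preview frame to an RGB_565 bitmap and run FaceDetector on it
Camera.PreviewCallback previewCallback = new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] pixelData, Camera camera) {
        Camera.Size size = camera.getParameters().getPreviewSize();
        YuvImage yuv = new YuvImage(pixelData, ImageFormat.NV21,
                size.width, size.height, null);
        ByteArrayOutputStream jpegStream = new ByteArrayOutputStream();
        yuv.compressToJpeg(new Rect(0, 0, size.width, size.height), 80, jpegStream);
        BitmapFactory.Options options = new BitmapFactory.Options();
        options.inPreferredConfig = Bitmap.Config.RGB_565; // FaceDetector needs RGB_565
        Bitmap frame = BitmapFactory.decodeByteArray(
                jpegStream.toByteArray(), 0, jpegStream.size(), options);
        FaceDetector detector = new FaceDetector(frame.getWidth(), frame.getHeight(), 1);
        FaceDetector.Face[] faces = new FaceDetector.Face[1];
        int found = detector.findFaces(frame, faces);
        Log.d("FaceDetect", "Faces found in preview frame: " + found);
    }
};
// camera.setPreviewCallback(previewCallback);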

Enabling Camera Flash While Recording Video

I need a way to control the camera flash on an Android device while it is recording video. I'm making a strobe light app, and recording video with a flashing strobe light would make it possible to capture objects that move at high speed, like a fan blade.
The flash can only be enabled by starting a video preview and setting FLASH_MODE_TORCH in the camera's parameters. That would look like this:
Camera c = Camera.open();
Camera.Parameters p = c.getParameters();
p.setFlashMode(Camera.Parameters.FLASH_MODE_TORCH);
c.setParameters(p);
c.startPreview();
Once the preview has started, I can flip that parameter back and forth to turn the light on and off. This works well until I try to record a video. The trouble is that in order to give the camera to the MediaRecorder, I first have to unlock it.
MediaRecorder m = new MediaRecorder();
c.unlock(); // the killer
m.setCamera(c);
After that unlock, I can no longer change the camera parameters and therefore have no way to change the flash state.
I do not know if it is actually possible to do this since I'm not the best at java-hacking, but here is what I do know:
Camera.unlock() is a native method, so I can't really see the mechanism behind the way it locks me out
Camera.Parameter has a HashMap that contains all of its parameters
Camera.setParameters(Parameters) takes the HashMap, converts it to a string, and passes it to a native method
I can eliminate all the parameters but TORCH-MODE from the HashMap and the Camera will still accept it
So, I can still access the Camera, but it won't listen to anything I tell it. (Which is kind of the purpose of Camera.unlock())
Edit:
After examining the native code, I can see that in CameraService.cpp my calls to Camera.setParameters(Parameters) get rejected because my Process ID does not match the Process ID the camera service has on record. So it would appear that that is my hurdle.
Edit2:
It would appear that the MediaPlayerService is the primary service that takes control of the camera when a video is recording. I do not know if it is possible, but if I could somehow start that service in my own process, I should be able to skip the Camera.unlock() call.
Edit3:
One last option would be if I could somehow get a pointer to the CameraHardwareInterface. From the looks of it, this is a device specific interface and probably does not include the PID checks. The main problem with this though is that the only place that I can find a pointer to it is in CameraService, and CameraService isn't talking.
Edit4: (several months later)
At this point, I don't think it is possible to do what I originally wanted. I don't want to delete the question on the off chance that someone does answer it, but I'm not actively seeking an answer. (Though, receiving a valid answer would be awesome.)
I encountered a similar issue: the user should be able to change the flash mode during recording, depending on the lighting situation. After some investigation I came to the following solution.
I assume that you've already set up a proper SurfaceView and a SurfaceHolder with the necessary callbacks. The first thing I did was provide this code (undeclared variables are globals):
public void surfaceCreated(SurfaceHolder holder) {
try {
camera = Camera.open();
parameters = camera.getParameters();
parameters.setFlashMode(Parameters.FLASH_MODE_OFF);
camera.setParameters(parameters);
camera.setPreviewDisplay(holder);
camera.startPreview();
recorder = new MediaRecorder();
} catch (IOException e) {
e.printStackTrace();
}
}
My next step was initializing and preparing the recorder:
private void initialize() {
camera.unlock();
recorder.setCamera(camera);
recorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
recorder.setVideoFrameRate(20);
recorder.setOutputFile(filePath);
try {
recorder.prepare();
} catch (IllegalStateException e) {
e.printStackTrace();
finish();
} catch (IOException e) {
e.printStackTrace();
finish();
}
}
It's important to note that camera.unlock() has to be called BEFORE the whole initialization process of the media recorder. Also be aware of the proper order of the set* calls; otherwise you'll get an IllegalStateException when calling prepare() or start(). When it comes to recording, I do this (it will usually be triggered by a view element):
public void record(View view) {
if (recording) {
recorder.stop();
//TODO: do stuff....
recording = false;
} else {
recording = true;
initialize();
recorder.start();
}
}
So now I can finally record properly. But what about the flash? Last but not least, here comes the magic behind the scenes:
public void flash(View view) {
if(!recording) {
camera.lock();
}
parameters.setFlashMode(parameters.getFlashMode().equals(Parameters.FLASH_MODE_TORCH) ? Parameters.FLASH_MODE_OFF : Parameters.FLASH_MODE_TORCH);
camera.setParameters(parameters);
if(!recording) {
camera.unlock();
}
}
Every time I call that method via an onClick action I can change the flash mode, even during recording. Just take care to lock the camera properly: once the lock is acquired by the media recorder during recording, you don't have to (and in fact cannot) lock/unlock the camera again. This was tested on a Samsung Galaxy S3 with Android 4.1.2. Hope this approach helps.
After preparing the media recorder, call camera.lock() and then set whatever parameters you want on the camera.
But before starting recording you need to call camera.unlock(), and after you stop the media recorder you need to call camera.lock() again to restart the preview.
Enjoy!
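A condensed sketch of that ordering (recorder is assumed to be a MediaRecorder configured the usual way, as in the answer above):
camera.unlock();                 // hand the camera to the recorder
recorder.setCamera(camera);
// ... set sources, formats, encoders and the output file ...
recorder.prepare();
camera.lock();                   // take the camera back to change parameters
Camera.Parameters params = camera.getParameters();
params.setFlashMode(Camera.Parameters.FLASH_MODE_TORCH);
camera.setParameters(params);
camera.unlock();                 // must be unlocked again before recording
recorder.start();
// ... recording ...
recorder.stop();
camera.lock();                   // re-acquire the camera so the preview can restart
camera.startPreview();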
Try this; hopefully it will work. :)
private static Torch torch;
public Torch() {
super();
torch = this;
}
public static Torch getTorch() {
return torch;
}
private void getCamera() {
if (mCamera == null) {
try {
mCamera = Camera.open();
} catch (RuntimeException e) {
Log.e(TAG, "Camera.open() failed: " + e.getMessage());
}
}
}
public void toggleLight(View view) {
toggleLight();
}
private void toggleLight() {
if (lightOn) {
turnLightOff();
} else {
turnLightOn();
}
}
private void turnLightOn() {
if (!eulaAgreed) {
return;
}
if (mCamera == null) {
Toast.makeText(this, "Camera not found", Toast.LENGTH_LONG);
button.setBackgroundColor(COLOR_WHITE);
return;
}
lightOn = true;
Parameters parameters = mCamera.getParameters();
if (parameters == null) {
button.setBackgroundColor(COLOR_WHITE);
return;
}
List<String> flashModes = parameters.getSupportedFlashModes();
if (flashModes == null) {
button.setBackgroundColor(COLOR_WHITE);
return;
}
String flashMode = parameters.getFlashMode();
Log.i(TAG, "Flash mode: " + flashMode);
Log.i(TAG, "Flash modes: " + flashModes);
if (!Parameters.FLASH_MODE_TORCH.equals(flashMode)) {
if (flashModes.contains(Parameters.FLASH_MODE_TORCH)) {
parameters.setFlashMode(Parameters.FLASH_MODE_TORCH);
mCamera.setParameters(parameters);
button.setBackgroundColor(COLOR_LIGHT);
startWakeLock();
} else {
Toast.makeText(this, "Flash mode (torch) not supported",
Toast.LENGTH_LONG);
button.setBackgroundColor(COLOR_WHITE);
Log.e(TAG, "FLASH_MODE_TORCH not supported");
}
}
}
private void turnLightOff() {
if (lightOn) {
button.setBackgroundColor(COLOR_DARK);
lightOn = false;
if (mCamera == null) {
return;
}
Parameters parameters = mCamera.getParameters();
if (parameters == null) {
return;
}
List<String> flashModes = parameters.getSupportedFlashModes();
String flashMode = parameters.getFlashMode();
if (flashModes == null) {
return;
}
Log.i(TAG, "Flash mode: " + flashMode);
Log.i(TAG, "Flash modes: " + flashModes);
if (!Parameters.FLASH_MODE_OFF.equals(flashMode)) {
if (flashModes.contains(Parameters.FLASH_MODE_OFF)) {
parameters.setFlashMode(Parameters.FLASH_MODE_OFF);
mCamera.setParameters(parameters);
stopWakeLock();
} else {
Log.e(TAG, "FLASH_MODE_OFF not supported");
}
}
}
}
private void startPreview() {
if (!previewOn && mCamera != null) {
mCamera.startPreview();
previewOn = true;
}
}
private void stopPreview() {
if (previewOn && mCamera != null) {
mCamera.stopPreview();
previewOn = false;
}
}
private void startWakeLock() {
if (wakeLock == null) {
Log.d(TAG, "wakeLock is null, getting a new WakeLock");
PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
Log.d(TAG, "PowerManager acquired");
wakeLock = pm.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, WAKE_LOCK_TAG);
Log.d(TAG, "WakeLock set");
}
wakeLock.acquire();
Log.d(TAG, "WakeLock acquired");
}
private void stopWakeLock() {
if (wakeLock != null) {
wakeLock.release();
Log.d(TAG, "WakeLock released");
}
}
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
if (Eula.show(this)) {
eulaAgreed = true;
}
setContentView(R.layout.main);
button = findViewById(R.id.button);
surfaceView = (SurfaceView) this.findViewById(R.id.surfaceview);
surfaceHolder = surfaceView.getHolder();
surfaceHolder.addCallback(this);
surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
disablePhoneSleep();
Log.i(TAG, "onCreate");
}
To access the device camera, you must declare the CAMERA permission in your Android Manifest. Also be sure to include the <uses-feature> manifest element to declare camera features used by your application. For example, if you use the camera and auto-focus feature, your Manifest should include the following:
<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
A sample that checks for torch support might look something like this:
//Create camera and parameter objects
private Camera mCamera;
private Camera.Parameters mParameters;
private boolean mbTorchEnabled = false;
//... later in a click handler or other location, assuming that the mCamera object has already been instantiated with Camera.open()
mParameters = mCamera.getParameters();
//Get supported flash modes
List<String> flashModes = mParameters.getSupportedFlashModes();
//Make sure that torch mode is supported
//EDIT - wrong and dangerous to check for torch support this way
//if(flashModes != null && flashModes.contains("torch")){
if(flashModes != null && flashModes.contains(Camera.Parameters.FLASH_MODE_TORCH)){
if(mbTorchEnabled){
//Set the flash parameter to off
mParameters.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
}
else{
//Set the flash parameter to use the torch
mParameters.setFlashMode(Camera.Parameters.FLASH_MODE_TORCH);
}
//Commit the camera parameters
mCamera.setParameters(mParameters);
mbTorchEnabled = !mbTorchEnabled;
}
To turn the torch on, you simply set the camera's flash mode parameter to Camera.Parameters.FLASH_MODE_TORCH:
Camera mCamera;
Camera.Parameters mParameters;
//Get a reference to the camera/parameters
mCamera = Camera.open();
mParameters = mCamera.getParameters();
//Set the torch parameter
mParameters.setFlashMode(Camera.Parameters.FLASH_MODE_TORCH);
//Commit camera parameters
mCamera.setParameters(mParameters);
To turn the torch off, set Camera.Parameters.FLASH_MODE_OFF.
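For completeness, the mirror of the snippet above, reusing the same mCamera and mParameters objects:
//Set the flash parameter back to off
mParameters.setFlashMode(Camera.Parameters.FLASH_MODE_OFF);
//Commit camera parameters
mCamera.setParameters(mParameters);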
