Android camera click continuous shots - Java

I am trying to make a camera application which takes 3 continuous shots.
I have tried to call "takePicture" several times by putting it in a loop, but with no success.
Please help on this matter; any help will be appreciated.

You should never call PictureCallback.onPictureTaken() from your own code; this callback receives data from the system when it is ready, in response to Camera.takePicture().
The latter call will only succeed if the camera is open and the preview is started. Therefore, simply calling Camera.takePicture() in a loop will not work (see e.g. Android 2.3.1 Camera takePicture() Multiple images with one button click). The correct way to handle this is to keep a counter of shots processed in your onPictureTaken(); if it is less than 3, restart the camera preview and issue (synchronously) another Camera.takePicture(). After this, onPictureTaken() should return, to allow processing of the next captured frame.

I use it like this when doing a photo burst. It also handles the FrameLayout holding the preview that starts the burst:
PictureCallback jpegCallback = new PictureCallback() {
    public void onPictureTaken(byte[] data, Camera camera) {
        FileOutputStream outStream = null;
        try {
            Parameters param = camera.getParameters();
            param.setPictureSize(640, 480);
            camera.setParameters(param);
            // Or write to sdcard
            outStream = new FileOutputStream(String.format(
                    Environment.getExternalStorageDirectory().getPath() + "/foto%d.jpg",
                    System.currentTimeMillis()));
            outStream.write(data);
            outStream.close();
            sendBroadcast(new Intent(Intent.ACTION_MEDIA_MOUNTED,
                    Uri.fromFile(Environment.getExternalStorageDirectory())));
            Log.i(TAG, "onPictureTaken - wrote bytes: " + data.length);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
        Log.d(TAG, "onPictureTaken - jpg");
        try {
            stillCount++;
            camera.startPreview();
            if (stillCount < 10) {
                preview.mCamera.takePicture(shutterCallback, rawCallback, jpegCallback);
                if (stillCount == 9) {
                    frameLayout.setClickable(true);
                }
            } else {
                stillCount = 0;
                takePictureButton.setEnabled(true);
                frameLayout.setClickable(true);
            }
        } catch (Exception e) {
            Log.d(TAG, "Error starting preview: " + e.toString());
        }
    }
};

I got the solution.
I was calling mCamera.startPreview(); outside of my loop.
The preview is a must for taking shots, and not including mCamera.startPreview(); was blocking my execution.

Related

How do I get the flashlight on my phone to flash accurately?

I am currently coding an Android application in Android Studio with Java. It is simple: when the button is clicked, a series of flashes of the phone's torch is launched. I need each flash to be regulated and to last the same amount of time, but that's not the case; the flashes do not all last the same.
In this code, for example, there are 150 loops of 400 milliseconds, so the run should last 150*0.4 = 60 seconds, but when I time it I get about 65 seconds, presumably because of the imprecision of the duration of each flash.
I wanted to know if anyone has an idea how to solve this problem.
Thanks in advance.
private void switchOn() { // Called each time the user presses the button (onClick)
    for (int i = 0; i < 150; i++) {
        flash(); // Calls the function that turns the torch on or off
        try {
            Thread.sleep(200); // Delay so the torch keeps its state for 200 milliseconds
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        flash();
        try {
            Thread.sleep(200);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}

public void flash() { // Turns the torch on or off according to its state
    try {
        CameraManager cameraManager = (CameraManager) activity.getSystemService(Context.CAMERA_SERVICE);
        for (String id : cameraManager.getCameraIdList()) {
            if (cameraManager.getCameraCharacteristics(id)
                    .get(CameraCharacteristics.FLASH_INFO_AVAILABLE)) {
                cameraManager.setTorchMode(id, !flashState);
                flashState = !flashState;
            }
        }
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}
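One hedged idea, not from the original post: schedule the torch toggles at a fixed rate instead of sleeping in a loop, so the time spent inside each setTorchMode call does not accumulate as drift. The ScheduledExecutorService usage and the method name startBlinking below are illustrative assumptions, not the asker's code:

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Hedged sketch: toggle the torch on a fixed-rate schedule instead of sleeping in a loop.
private ScheduledExecutorService scheduler;

private void startBlinking() {
    scheduler = Executors.newSingleThreadScheduledExecutor();
    final int totalToggles = 300;                  // 150 on/off cycles of 2 x 200 ms
    final AtomicInteger toggles = new AtomicInteger();

    scheduler.scheduleAtFixedRate(new Runnable() {
        @Override
        public void run() {
            flash();                               // same toggle method as above
            if (toggles.incrementAndGet() >= totalToggles) {
                scheduler.shutdown();              // stop after roughly 60 seconds
            }
        }
    }, 0, 200, TimeUnit.MILLISECONDS);             // one toggle every 200 ms from the start time
}

With scheduleAtFixedRate the period is measured from the start of the schedule rather than from the end of each toggle, so the small overhead of every setTorchMode call should no longer add up to the extra seconds described above.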

Outgoing call recording audio file has no sound

After recording an outgoing phone call, I am trying to play the recorded file to make sure the call recording worked as expected (I am doing it using MediaPlayer), but there is no sound.
So I tried to access the actual file on the phone (I simply attached the phone to the computer and accessed its files). When I played the recording it had the right length, but again there was no sound.
What am I missing?
This is how I record the phone call:
MediaRecorder recorder = new MediaRecorder();
recorder.setAudioSource(MediaRecorder.AudioSource.VOICE_COMMUNICATION);
recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
// recorder.setOutputFormat(MediaRecorder.OutputFormat.AMR_NB);
recorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);

File callAudioFile = null;
try {
    File downloadsDir = context.getApplicationContext().getExternalFilesDir(Environment.DIRECTORY_DOWNLOADS);
    callAudioFile = File.createTempFile("deTuCelRecord", ".amr", downloadsDir);
} catch (IOException e) {
    e.printStackTrace();
}
assert callAudioFile != null;
audioFilePath = callAudioFile.getAbsolutePath();
recorder.setOutputFile(audioFilePath);

try {
    recorder.setOnErrorListener(new MediaRecorder.OnErrorListener() {
        @Override
        public void onError(MediaRecorder mr, int what, int extra) {
            Log.e("MediaRecorder", "MediaRecorder error " + what + " " + extra);
        }
    });
    recorder.prepare();
} catch (IllegalStateException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
}
recorder.start();
This is the code which ends the call recording:
recorder.stop();
recorder.release();
This is how I play the audio file:
MediaPlayer mPlayer = new MediaPlayer();
try {
    mPlayer.setDataSource(audioFilePath);
    mPlayer.prepare();
    Toast.makeText(getApplicationContext(), "PLAYING AUDIO", Toast.LENGTH_LONG).show();
    mPlayer.start();
    Log.d("PLAY_AUDIO", "Started playing audio");
} catch (IOException e) {
    Log.e("PLAY_AUDIO", "Failed to play audio");
}
Please check the Accessibility Service setting on your test phone if you are trying to record a call on Android Q. Please refer to this link.
You can try
recorder.setAudioSource(MediaRecorder.AudioSource.VOICE_CALL);
It needs the Manifest.permission.CAPTURE_AUDIO_OUTPUT permission.
Please check the AudioSource documentation for the difference between VOICE_CALL and VOICE_COMMUNICATION.
The issue was the Android version: from Android 10 onward it didn't allow me to record the call, but on Android 9 it did.

Cannot play music from an HTTP server for more than 3 seconds

I am trying to make a music streaming app. For the music I use a URL, and for some reason the music plays for 3 seconds and then automatically stops. Please help me fix it.
I also get a message in the Run window that says: "MediaPlayer finalized without being released".
Thanks.
the code:
String url = "https://radio.streamgates.net/stream/1036kh"; // your URL here
MediaPlayer mediaPlayer = new MediaPlayer();
mediaPlayer.setAudioStreamType(AudioManager.STREAM_VOICE_CALL);
try {
    mediaPlayer.setDataSource(url);
} catch (IOException e) {
    Toast.makeText(this, "failed", Toast.LENGTH_LONG).show();
    e.printStackTrace();
}
try {
    mediaPlayer.prepare(); // might take long! (for buffering, etc)
} catch (IOException e) {
    Toast.makeText(this, "failed", Toast.LENGTH_LONG).show();
    e.printStackTrace();
}
mediaPlayer.start();
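The "MediaPlayer finalized without being released" message suggests the local MediaPlayer may be garbage-collected once the enclosing method returns, which would stop playback. A hedged sketch of one possible fix inside the Activity, keeping the player in a field and releasing it explicitly; the method name startStreaming and the listener wiring are illustrative:

private MediaPlayer mediaPlayer;

private void startStreaming(String url) {
    mediaPlayer = new MediaPlayer();
    // STREAM_MUSIC routes audio to the media volume; the original code used STREAM_VOICE_CALL.
    mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
    try {
        mediaPlayer.setDataSource(url);
        mediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                mp.start(); // start only once buffering is ready
            }
        });
        mediaPlayer.prepareAsync(); // avoid blocking the UI thread while buffering
    } catch (IOException e) {
        Toast.makeText(this, "failed", Toast.LENGTH_LONG).show();
        e.printStackTrace();
    }
}

@Override
protected void onDestroy() {
    super.onDestroy();
    if (mediaPlayer != null) {
        mediaPlayer.release(); // release explicitly so the player is never finalized while still in use
        mediaPlayer = null;
    }
}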

How to pass a Bitmap from one Activity to another in a fast way?

I'm trying to pass a Bitmap from one Activity to another. I tried multiple solutions, but they are not fast enough.
Currently I'm facing this problem: when I click the next button, the app freezes for 2 seconds and then moves to the next Activity with the right Bitmap shown in the ImageView.
I found this solution on Stack Overflow. Here is the code:
Uri imageUri = intent.getParcelableExtra("URI");
if (imageUri != null) {
    imageView.setImageURI(imageUri);
    try {
        bitmap = MediaStore.Images.Media.getBitmap(this.getContentResolver(), imageUri);
    } catch (IOException e) {
        e.printStackTrace();
    }
} else {
    Toast.makeText(this, "No image is set to show", Toast.LENGTH_LONG).show();
}

btn_next_process.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View view) {
        if (bitmap == null) {
            Toast.makeText(CropResultActivity.this, "Emptyyy", Toast.LENGTH_SHORT).show();
        } else {
            try {
                // Write file
                String filename = "bitmap.png";
                FileOutputStream stream = CropResultActivity.this.openFileOutput(filename, Context.MODE_PRIVATE);
                bitmap.compress(Bitmap.CompressFormat.PNG, 100, stream);
                // Cleanup
                stream.close();
                // bitmap.recycle();
                // Pop intent
                Intent in1 = new Intent(CropResultActivity.this, InputProcessingActivity.class);
                in1.putExtra("image_data", filename);
                startActivity(in1);
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }
});
Then I tried to save the file in a worker Thread first and retrieve it when I click the next button. Now it's fast, but I am getting the wrong Bitmap.
Here is the code:
Uri imageUri = intent.getParcelableExtra("URI");
if (imageUri != null) {
    imageView.setImageURI(imageUri);
    try {
        bitmap = MediaStore.Images.Media.getBitmap(this.getContentResolver(), imageUri);
        new Thread() {
            @Override
            public void run() {
                try {
                    // Write file
                    FileOutputStream stream = CropResultActivity.this.openFileOutput(filename, Context.MODE_PRIVATE);
                    bitmap.compress(Bitmap.CompressFormat.PNG, 100, stream);
                    // Cleanup
                    stream.close();
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }.run();
    } catch (IOException e) {
        e.printStackTrace();
    }
} else {
    Toast.makeText(this, "No image is set to show", Toast.LENGTH_LONG).show();
}

btn_next_process.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View view) {
        if (bitmap == null) {
            Toast.makeText(CropResultActivity.this, "Empty", Toast.LENGTH_SHORT).show();
        } else {
            // Pop intent
            Intent in1 = new Intent(CropResultActivity.this, InputProcessingActivity.class);
            in1.putExtra("image_data", filename);
            startActivity(in1);
        }
    }
});
In the second Activity I retrieve the Bitmap this way:
private void getIncomingIntent() {
    if (getIntent().hasExtra("image_data")) {
        try {
            String filename = getIntent().getStringExtra("image_data");
            FileInputStream is = this.openFileInput(filename);
            imageToProcess = BitmapFactory.decodeStream(is);
            process_detect_edges(imageToProcess);
            is.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
What am I doing wrong?
Just pass the image Uri to the next Activity and load from the Uri in the other Activity.
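A hedged sketch of that suggestion, reusing the question's names (CropResultActivity, InputProcessingActivity, the "URI" extra, imageView); the exact wiring is illustrative:

// Sending Activity: pass only the Uri; Uri is Parcelable, so no bitmap bytes go through the Intent.
Intent in1 = new Intent(CropResultActivity.this, InputProcessingActivity.class);
in1.putExtra("URI", imageUri);
startActivity(in1);

// Receiving Activity: load directly from the Uri instead of decoding a temp file.
Uri imageUri = getIntent().getParcelableExtra("URI");
if (imageUri != null) {
    imageView.setImageURI(imageUri);
}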
Create a trivial Service. Give that Service a public Bitmap mBitmap member.
Keep each Activity bound to the Service while they are between onStart() and onStop().
If your Activities have a reference to the Service, they can communicate directly via the mBitmap member. One Activity can set mBitmap, then start the other Activity. The second Activity can simply grab the reference (after binding, of course), and begin manipulating the Bitmap. Since everything happens on the UI thread, there are no synchronization concerns. And everything is quite fast.
This solution does not address problems of persistence: if the user leaves the app for a period of time (puts it in the background, locks the screen, etc.), the entire app may be destroyed, and mBitmap would be lost. However, if you're just trying to share data between two successive Activities, this is a straightforward way of doing it.
You could even share the Bitmap via a public static reference, placed in any convenient class. There are rumors that the garbage collector goes around setting static references to null at a whim, but this is a misinterpretation of the actual behavior: the entire app may get cleaned up at an unexpected time. When you return to your Activity, the system may actually have to restart the app and recreate the Activity from scratch. In that case, the reference would be reset to null.
Instead, using a Service indicates to the OS that you have a component of your app that should be a little bit longer-lived. Certainly, it will continue to exist across the gap between two successive Activities.
Note that, on Oreo and later, the system can be quite aggressive about cleaning up apps as soon as they leave the foreground.
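A minimal sketch of the bound-Service idea described above; the class name BitmapHolderService and the LocalBinder pattern are illustrative, not part of the original answer:

import android.app.Service;
import android.content.Intent;
import android.graphics.Bitmap;
import android.os.Binder;
import android.os.IBinder;

public class BitmapHolderService extends Service {
    public Bitmap mBitmap; // shared directly by the Activities bound to this Service

    private final IBinder binder = new LocalBinder();

    public class LocalBinder extends Binder {
        public BitmapHolderService getService() {
            return BitmapHolderService.this; // bound Activities get the Service instance itself
        }
    }

    @Override
    public IBinder onBind(Intent intent) {
        return binder;
    }
}

Each Activity would call bindService(new Intent(this, BitmapHolderService.class), connection, Context.BIND_AUTO_CREATE) in onStart() and unbindService(connection) in onStop(), then read or write mBitmap through the Service reference returned by the binder.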

Trouble with Android Camera

I have some code I have been experimenting with to see what I can do with the camera device. The following code works, but I have some issues with it that I cannot seem to solve.
The first call never works. The first time the code runs, the onPictureTaken callback is never called, so the file is never written. However, the camera goes through all the other steps, including making the fake shutter noise.
I can't seem to set the picture size to something other than whatever it defaults to. If I try to set it to something else, the code stops working: the same as above, where the camera goes through all the motions but the onPictureTaken callback is never called.
When the pictures are saved to the DCIM folder, they do not show up in the default 'Photos' app on my phone unless I reboot the phone.
Is there any way to disable the shutter noise through code?
Sorry, the code is a little messy because it's an experiment.
Also, this code is executed in a BroadcastReceiver.
@Override
public void onReceive(Context context, Intent intent) {
    if (intent.getAction().equals(TAKE_PICTURE_INTENT)) {
        Toast.makeText(context, "Test", Toast.LENGTH_LONG).show();
        System.out.println("GOT THE INTENT");
        try {
            Camera camera = Camera.open();
            System.out.println("CAMERA OPENED");
            Parameters params = camera.getParameters();
            params.set("flash-mode", "off");
            params.set("focus-mode", "infinity");
            params.set("jpeg-quality", "100");
            //params.setPictureSize(2592, 1952);
            String str = params.get("picture-size" + "-values");
            System.out.println(str);
            String size = str.split(",")[0];
            System.out.println(size);
            //params.set("picture-size", size);
            camera.setParameters(params);
            System.out.println("CAMERA PARAMETERS SET");
            camera.startPreview();
            System.out.println("CAMERA PREVIEW STARTED");
            camera.autoFocus(new AutoFocusCallBackImpl());
        } catch (Exception ex) {
            System.out.println("CAMERA FAIL, SKIP");
            return;
        }
    }
} // onReceive
private void TakePicture(Camera camera) {
    camera.takePicture(
            new Camera.ShutterCallback() {
                @Override
                public void onShutter() {
                    System.out.println("CAMERA SHUTTER CALLBACK");
                }
            },
            null,
            new Camera.PictureCallback() {
                public void onPictureTaken(byte[] imageData, Camera c) {
                    //c.release();
                    System.out.println("CAMERA CALLBACK");
                    FileOutputStream outStream = null;
                    try {
                        System.out.println("Start Callback");
                        File esd = Environment.getExternalStorageDirectory();
                        outStream = new FileOutputStream(esd.getAbsolutePath() + String.format(
                                "/DCIM/%d.jpg", System.currentTimeMillis()));
                        outStream.write(imageData);
                        outStream.close();
                        System.out.println("onPictureTaken - wrote bytes: " + imageData.length);
                    } catch (FileNotFoundException e) {
                        e.printStackTrace();
                        System.out.println("File not found exception");
                    } catch (IOException e) {
                        e.printStackTrace();
                        System.out.println("IO exception");
                    } finally {
                        System.out.println("Finally");
                        c.release();
                    }
                }
            });
    //camera.release();
} // TakePicture
private class AutoFocusCallBackImpl implements Camera.AutoFocusCallback {
    @Override
    public void onAutoFocus(boolean success, Camera camera) {
        //bIsAutoFocused = success; // update the flag used in onKeyDown()
        System.out.println("Inside autofocus callback. autofocused=" + success);
        //play the autofocus sound
        //MediaPlayer.create(CameraActivity.this, R.raw.auto_focus).start();
        if (success) {
            System.out.println("AUTO FOCUS SUCCEEDED");
        } else {
            System.out.println("AUTO FOCUS FAILED");
        }
        TakePicture(camera);
        System.out.println("CALLED TAKE PICTURE");
    }
} // AutoFocusCallBackImpl
1. First of all, move all the camera logic out of the BroadcastReceiver and put it into a separate Activity.
2.
When the pictures are saved to the DCIM folder, they do not show up in the default 'Photos' app on my phone unless I reboot the phone.
This happens because the MediaScanner needs to be asked to rescan the images/changes once you take a photo. When you reboot the phone, the MediaScanner scans the media and finds the new images. For this issue you should check out MediaScanner (see the sketch below).
3. Follow the Android Camera tutorial & Camera API.
Thanks
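As a hedged sketch of point 2, MediaScannerConnection.scanFile can be used to ask the media scanner to index a newly written photo; photoPath below is an illustrative variable holding the path written in onPictureTaken above:

// After writing the JPEG, ask the media scanner to index it so gallery apps see it
// without a reboot. context is available inside onReceive.
MediaScannerConnection.scanFile(
        context,
        new String[]{ photoPath },          // files to scan
        new String[]{ "image/jpeg" },       // MIME types (may be null to infer from the file)
        new MediaScannerConnection.OnScanCompletedListener() {
            @Override
            public void onScanCompleted(String path, Uri uri) {
                Log.i("MediaScanner", "Scanned " + path + " -> " + uri);
            }
        });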
