Android Camera2 Preview Image Decoding fails on car head unit - java

I am pretty new to the Camera2 API and currently trying to implement a camera preview in my own car application on an Android head unit (model YT9213AJ). The preview should show the image of the reverse camera.
I've tested the following code on a Samsung tablet (SM-P610), and it shows the camera preview as expected from both the rear and the front camera.
private void bindPreview(@NonNull ProcessCameraProvider cameraProvider) {
    CameraSelector cameraSelector = new CameraSelector.Builder()
            .requireLensFacing(CameraSelector.LENS_FACING_FRONT)
            .build();
    CameraManager manager = (CameraManager) context.getSystemService(CAMERA_SERVICE);
    Size previewSize = getPreviewSize(manager.getCameraCharacteristics("1"));
    Preview preview = new Preview.Builder()
            .setTargetResolution(previewSize)
            .setDefaultResolution(previewSize)
            .setMaxResolution(previewSize)
            .setTargetRotation(Surface.ROTATION_270)
            .build();
    preview.setSurfaceProvider(previewView.getSurfaceProvider());
    Camera camera = cameraProvider.bindToLifecycle((LifecycleOwner) this, cameraSelector, preview);
}
With this function to get the preview size:
Size getPreviewSize(CameraCharacteristics characteristics) {
    StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
    Size[] previewSizes = map.getOutputSizes(SurfaceTexture.class);
    return previewSizes[0]; // the camera reports just one resolution, so this should be sufficient
}
This is what the image looks like if I run it on the car head unit:
Example image preview from car head unit
(Sorry, can't embed images into post yet)
I've also run the app "Camera2 API probe"; please find the results here on AirBeat. The camera with ID 5 seems to be a placeholder; I assume the other two cameras (ID 0 and ID 1) represent the two hardware inputs into the head unit.
Do you have any clue how I can correctly decode the image for this camera model? Thanks for your time.

You could try seeing if not rotating the output helps - the size listed by the camera is already a portrait aspect ratio.
In general, though, my guess is that whoever made this head unit didn't take much care to make sure the camera output works with the display side. So it may not be possible to get it to work with the CameraX PreviewView as-is. PreviewView tries to select the best kind of output View for the device's API level, but it may be picking a path the manufacturer never actually tested. There's no built-in way to make it select the other option (SurfaceView and TextureView are the choices), so you'd have to modify the PreviewView code or build your own Preview surface provider.
There's also no guarantee that'll help, depending on exactly how badly this device implements things.
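For the first suggestion, a minimal sketch (untested on this device; previewSize, previewView, cameraSelector and cameraProvider are the same objects as in the question) would be to build the Preview without forcing a target rotation:
Preview preview = new Preview.Builder()
        .setTargetResolution(previewSize) // the reported stream size is already portrait
        .build();
preview.setSurfaceProvider(previewView.getSurfaceProvider());
cameraProvider.bindToLifecycle((LifecycleOwner) this, cameraSelector, preview);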

Related

Camera View OpenCV in Android Cannot Fit

I have a problem with the OpenCV camera view on Android: on a Samsung device the camera view doesn't fit and there is a black cut like in picture 1, while on Xiaomi and Realme devices it looks fine, as in picture 2. I took the middle resolution from the supported preview sizes and set the maxFrameSize to a 1:1 ratio. How can the camera view size be made compatible with all current Android devices? Is this purely because my code is still wrong, or is it the camera settings of the Android device itself?
Picture 1 Device Samsung A51
Picture 2 Device Realme 3
setResolution():
mCamera = android.hardware.Camera.open();
Camera.Parameters params = mCamera.getParameters();
List<Camera.Size> listSizes = params.getSupportedPreviewSizes();
List<Camera.Size> listCapture = params.getSupportedPictureSizes();
int midResolution = listSizes.size() / 2;
cameraSize = listSizes.get(midResolution);
params.setPictureSize(cameraSize.width, cameraSize.height);
params.setVideoStabilization(true);
params.setPreviewSize(cameraSize.width, cameraSize.height);
mCamera.setParameters(params);
mCamera.startPreview();
setMaxFrameSize():
jCameraView.setResolution();
Camera.Size sizeMaxFrame = jCameraView.getSizeCamera();
jCameraView.setMaxFrameSize(sizeMaxFrame.height, sizeMaxFrame.height);
I set the maxFrameSize to the same value for both dimensions, using the height value from the setResolution() method.
Sorry if my language or question is not easy to understand
This is because of the Camera2 framework you are using. It is really a pain to work with that framework and provide support for all devices. To cope with this issue, the Android team came up with a new library, CameraX.
The documentation is here:
https://developer.android.com/training/camerax
It provides rich support for most devices and has a lot of extra features, known as Vendor Extensions; you will find more about them at the link above. This framework is built on top of the Camera / Camera2 framework. So you should consider migrating to CameraX for better support on more devices using the same code, or you need to cater to individual sets of devices using the current framework.
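If you stay on the current framework, a minimal sketch (my own illustration, not part of the answer) would be to pick the supported preview size whose aspect ratio is closest to the target view, instead of simply taking the middle entry:
// Hypothetical helper: choose the supported size closest to the view's aspect ratio.
static Camera.Size chooseBestPreviewSize(List<Camera.Size> sizes, int viewWidth, int viewHeight) {
    double targetRatio = (double) viewWidth / viewHeight;
    Camera.Size best = sizes.get(0);
    double bestDiff = Double.MAX_VALUE;
    for (Camera.Size size : sizes) {
        double diff = Math.abs((double) size.width / size.height - targetRatio);
        if (diff < bestDiff) {
            bestDiff = diff;
            best = size;
        }
    }
    return best;
}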

Android Java - programmatically capture background before app window opens [duplicate]

Possible Duplicate:
How to programmatically take a screenshot on Android?
How can I capture the Android device screen content and make an image file from the snapshot data? Which API should I use, or where could I find related resources?
BTW:
not a camera snapshot, but the device screen
Use the following code:
Bitmap bitmap;
View v1 = MyView.getRootView();
v1.setDrawingCacheEnabled(true);
bitmap = Bitmap.createBitmap(v1.getDrawingCache());
v1.setDrawingCacheEnabled(false);
Here MyView is the View we want to capture in the screenshot. You can also get the drawing cache of any View this way (without getRootView()).
There is also another way. If we have a ScrollView as the root view, then it's better to use the following code:
LayoutInflater inflater = (LayoutInflater) this.getSystemService(LAYOUT_INFLATER_SERVICE);
FrameLayout root = (FrameLayout) inflater.inflate(R.layout.activity_main, null); // activity_main is the UI (XML) file used in our Activity class; FrameLayout is the root view of that file.
root.setDrawingCacheEnabled(true);
Bitmap bitmap = getBitmapFromView(this.getWindow().findViewById(R.id.frameLayout)); // pass the id of your root layout here (in this case, my FrameLayout's id)
root.setDrawingCacheEnabled(false);
Here is the getBitmapFromView() method
public static Bitmap getBitmapFromView(View view) {
    // Define a bitmap with the same size as the view
    Bitmap returnedBitmap = Bitmap.createBitmap(view.getWidth(), view.getHeight(), Bitmap.Config.ARGB_8888);
    // Bind a canvas to it
    Canvas canvas = new Canvas(returnedBitmap);
    // Get the view's background
    Drawable bgDrawable = view.getBackground();
    if (bgDrawable != null) {
        // The view has a background drawable, so draw it on the canvas
        bgDrawable.draw(canvas);
    } else {
        // The view has no background drawable, so draw a white background on the canvas
        canvas.drawColor(Color.WHITE);
    }
    // Draw the view on the canvas
    view.draw(canvas);
    // Return the bitmap
    return returnedBitmap;
}
It will capture the entire screen, including content hidden in your ScrollView.
UPDATED AS OF 20-04-2016
There is another, better way to take a screenshot. Here I have taken a screenshot of a WebView.
WebView w = new WebView(this);
w.setWebViewClient(new WebViewClient() {
    public void onPageFinished(final WebView webView, String url) {
        new Handler().postDelayed(new Runnable() {
            @Override
            public void run() {
                webView.measure(View.MeasureSpec.makeMeasureSpec(
                                View.MeasureSpec.UNSPECIFIED, View.MeasureSpec.UNSPECIFIED),
                        View.MeasureSpec.makeMeasureSpec(0, View.MeasureSpec.UNSPECIFIED));
                webView.layout(0, 0, webView.getMeasuredWidth(), webView.getMeasuredHeight());
                webView.setDrawingCacheEnabled(true);
                webView.buildDrawingCache();
                Bitmap bitmap = Bitmap.createBitmap(webView.getMeasuredWidth(),
                        webView.getMeasuredHeight(), Bitmap.Config.ARGB_8888);
                Canvas canvas = new Canvas(bitmap);
                Paint paint = new Paint();
                int height = bitmap.getHeight();
                canvas.drawBitmap(bitmap, 0, height, paint);
                webView.draw(canvas);
                if (bitmap != null) {
                    try {
                        String filePath = Environment.getExternalStorageDirectory().toString();
                        OutputStream out = null;
                        File file = new File(filePath, "/webviewScreenShot.png");
                        out = new FileOutputStream(file);
                        bitmap.compress(Bitmap.CompressFormat.PNG, 50, out);
                        out.flush();
                        out.close();
                        bitmap.recycle();
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            }
        }, 1000);
    }
});
Hope this helps!
AFAIK, all of the current methods to capture a screenshot on Android use the /dev/graphics/fb0 framebuffer. This includes ddms. Reading from this stream does require root. ddms uses adbd to request the information, so root is not required there, as adb has the permissions needed to request the data from /dev/graphics/fb0.
The framebuffer contains 2+ "frames" of RGB565 images. If you are able to read the data, you have to know the screen resolution to know how many bytes are needed for the image. Each pixel is 2 bytes, so if the screen resolution were 480x800, you would have to read 768,000 bytes for the image, since a 480x800 RGB565 image has 384,000 pixels.
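As a minimal sketch of that arithmetic in Java (assuming root access to /dev/graphics/fb0 and a 480x800 RGB565 screen; the byte order and single-frame read are my assumptions, not from the answer):
// Read one raw RGB565 frame from the framebuffer and decode it into a Bitmap.
int width = 480, height = 800;
byte[] raw = new byte[width * height * 2]; // 2 bytes per RGB565 pixel = 768,000 bytes
FileInputStream in = new FileInputStream("/dev/graphics/fb0");
int read = 0;
while (read < raw.length) {
    int n = in.read(raw, read, raw.length - read);
    if (n < 0) break;
    read += n;
}
in.close();
Bitmap frame = Bitmap.createBitmap(width, height, Bitmap.Config.RGB_565);
frame.copyPixelsFromBuffer(ByteBuffer.wrap(raw));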
For newer Android platforms, one can execute the system utility screencap in /system/bin to get a screenshot without root permission.
You can try /system/bin/screencap -h to see how to use it under adb or any shell.
By the way, I think this method is only good for a single snapshot.
If we want to capture multiple frames for screen recording, it will be too slow.
I don't know whether there is any other approach for faster screen capture.
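For example, from a development machine the utility can be invoked over adb (the output path here is just an illustration):
adb shell /system/bin/screencap -p /sdcard/screenshot.png
adb pull /sdcard/screenshot.png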
[Based on Android source code:]
On the C++ side, SurfaceFlinger implements the captureScreen API. This is exposed over the binder IPC interface, returning each time a new ashmem area that contains the raw pixels from the screen. The actual screenshot is taken through OpenGL.
For system C++ clients, the interface is exposed through the ScreenshotClient class, defined in <surfaceflinger_client/SurfaceComposerClient.h> for Android < 4.1; for Android >= 4.1 use <gui/SurfaceComposerClient.h>.
Before JB, to take a screenshot in a C++ program, this was enough:
ScreenshotClient ssc;
ssc.update();
With JB and multiple displays, it becomes slightly more complicated:
ssc.update(
android::SurfaceComposerClient::getBuiltInDisplay(
android::ISurfaceComposer::eDisplayIdMain));
Then you can access it:
do_something_with_raw_bits(ssc.getPixels(), ssc.getSize(), ...);
Using the Android source code, you can compile your own shared library to access that API, and then expose it through JNI to Java. To create a screenshot from your app, the app has to have the READ_FRAME_BUFFER permission.
But even then, apparently you can create screenshots only from system applications, i.e. ones that are signed with the same key as the system. (This part I still don't quite understand, since I'm not familiar enough with the Android permissions system.)
Here is a piece of code, for JB 4.1 / 4.2:
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <utils/RefBase.h>
#include <binder/IBinder.h>
#include <binder/MemoryHeapBase.h>
#include <gui/ISurfaceComposer.h>
#include <gui/SurfaceComposerClient.h>

static void do_save(const char *filename, const void *buf, size_t size) {
    int out = open(filename, O_RDWR|O_CREAT, 0666);
    int len = write(out, buf, size);
    printf("Wrote %d bytes to out.\n", len);
    close(out);
}

int main(int ac, char **av) {
    android::ScreenshotClient ssc;
    const void *pixels;
    size_t size;

    if (ssc.update(
            android::SurfaceComposerClient::getBuiltInDisplay(
                android::ISurfaceComposer::eDisplayIdMain)) == android::NO_ERROR) {
        printf("Captured: w=%d, h=%d, format=%d\n",
               ssc.getWidth(), ssc.getHeight(), ssc.getFormat());
        pixels = ssc.getPixels();
        size = ssc.getSize();
        do_save(av[1], pixels, size);
    } else {
        printf("Screenshot capture failed\n");
    }
    return 0;
}
You can try the following library: the Android Screenshot Library (ASL) makes it possible to programmatically capture screenshots from Android devices without requiring root access privileges. Instead, ASL utilizes a native service running in the background, started via the Android Debug Bridge (ADB) once per device boot.
According to this link, it is possible to use ddms in the tools directory of the android sdk to take screen captures.
To do this within an application (and not during development), there are also applications that can do so. But as @zed_0xff points out, it certainly requires root.
The framebuffer seems the way to go, but it will not always contain 2+ frames, as mentioned by Ryan Conrad. In my case it contained only one. I guess it depends on the frame/display size.
I tried to read the framebuffer continuously, but it seems to return a fixed number of bytes per read. In my case that is 3,410,432 bytes, which is enough to store a display frame of 854x480 RGBA (3,279,360 bytes). Yes, the binary frame output from fb0 is RGBA on my device. This will most likely vary from device to device, and it will be important for you when decoding it.
On my device, the permissions on /dev/graphics/fb0 are such that only root and users in the graphics group can read fb0. graphics is a restricted group, so you will probably only be able to access fb0 on a rooted phone using the su command.
Android apps have the user id (uid) app_## and group id (gid) app_##.
An adb shell has uid shell and gid shell, which has many more permissions than an app.
You can check those permissions in /system/permissions/platform.xml.
This means you will be able to read fb0 in the adb shell without root, but you will not be able to read it from within the app without root.
Also, adding the READ_FRAME_BUFFER and/or ACCESS_SURFACE_FLINGER permissions to AndroidManifest.xml will do nothing for a regular app, because these only work for 'signature' apps.
If you want to do screen capture from Java code in an Android app, AFAIK you must have root privileges.

How to use flash without stopping camera feed?

I am currently working on a barcode scanning app, which uses the Mobile Vision API for the majority of its processing. I am trying to implement a flash button so that a user can scan in low light, but for some reason activating the flash freezes the camera feed. Is there some way to turn on the flash with a button while the feed is active, i.e. to activate the flash without interfering with other threads? Thanks!
I used this code in my custom camera application. When the user clicks the flash-on button, the flash turns on. I think this code will help you.
Try this code (on button click):
private void btnFlashOnClick() {
    if (mCamera != null) {
        // First get the camera parameters.
        Camera.Parameters parameters = mCamera.getParameters();
        // Set the flash mode on the parameters.
        parameters.setFlashMode(Camera.Parameters.FLASH_MODE_TORCH);
        // Set the parameters object back on the camera.
        mCamera.setParameters(parameters);
        // Finally, start the camera preview.
        mCamera.startPreview(); // This line is useful for my app; if you don't need it, remove it.
    }
}
This code works fine in my app. Hope this helps you!
It really depends which camera API you are using, as there are a few.
CameraManager has
void setTorchMode(String cameraId, boolean enabled)
which lets you operate the flash regardless of the current state of the camera (and without needing to restart it), but it can be overridden by other apps too.
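A minimal sketch of that approach (API 23+; looking up a flash-capable camera id and the context variable are my assumptions, not part of the answer):
CameraManager cm = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
try {
    for (String id : cm.getCameraIdList()) {
        Boolean hasFlash = cm.getCameraCharacteristics(id)
                .get(CameraCharacteristics.FLASH_INFO_AVAILABLE);
        if (Boolean.TRUE.equals(hasFlash)) {
            cm.setTorchMode(id, true); // pass false to turn the torch back off
            break;
        }
    }
} catch (CameraAccessException e) {
    e.printStackTrace();
}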

Video Recording on Android Using Kivy (Python)

I am trying to record video through Kivy (http://kivy.org/#home) and am not sure what direction to take or which libraries to use.
Currently I have the camera widget working with the code below, which gets the camera to display on the screen, but I am not sure how to get it to record and save the video file. Any help is greatly appreciated!
from kivy.app import App
from kivy.core.window import Window
from kivy.uix.widget import Widget
from kivy.uix.camera import Camera
from kivy.uix.button import Button

class MyApp(App):
    # Function to take a screenshot
    def doscreenshot(self, *largs):
        Window.screenshot(name='screenshot%(counter)04d.jpg')

    def build(self):
        camwidget = Widget()  # Create a camera Widget
        cam = Camera(resolution=(640, 480), size=(500, 500))  # Get the camera
        cam.play = True  # Start the camera
        camwidget.add_widget(cam)

        button = Button(text='screenshot', size_hint=(0.12, 0.12))
        button.bind(on_press=self.doscreenshot)
        camwidget.add_widget(button)  # Add button to Camera Widget
        return camwidget

if __name__ == '__main__':
    MyApp().run()
Kivy only supports playing video and the camera widget. There is nothing in the framework for encoding video and saving it to a file.
Try using GStreamer directly instead; maybe you'll have more luck.

How to get supported video camera resolutions in android?

I am writing an app where I allow the user to capture video using the phone's camera. I am using my own code to record the video, as opposed to Android's built-in camera app.
Everything is working OK except that I need to be able to access the list of supported camera resolutions so I can choose at runtime which one to use. I am looking for something like getSupportedPictureSizes(), but for video. Android 3.0 has this functionality, but I am looking for something for 2.2.
As of right now I am using CamcorderProfile.QUALITY_HIGH / QUALITY_LOW, but this only gives me two options, and on the phones I have been testing on, the file sizes are at each extreme (QUALITY_LOW is 216 kb/s and QUALITY_HIGH is > 3 MB/s).
Any help would be greatly appreciated,
Thank You!
Did you try the getSupportedVideoSizes() method from the Camera.Parameters class?
public List<Camera.Size> getSupportedVideoSizes()
This method returns a list of Size objects. It will return null if the camera does not have separate preview and video output. The answer here indicates that when this returns null you may use the getSupportedPreviewSizes() list.
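A minimal sketch of that fallback, assuming an already-opened Camera instance named camera:
Camera.Parameters params = camera.getParameters();
List<Camera.Size> videoSizes = params.getSupportedVideoSizes();
if (videoSizes == null) {
    // No separate video output: the supported preview sizes apply to video as well.
    videoSizes = params.getSupportedPreviewSizes();
}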
OK, I think I figured it out. It seems to work correctly on the phones I have been testing on.
List<Size> tmpList = camera.getParameters().getSupportedPreviewSizes();
final List<Size> sizeList = new Vector<Size>();
// compare the aspect ratio of the candidate sizes against the real ratio
Double aspectRatio = (Double.valueOf(getWindowManager().getDefaultDisplay().getHeight()) / getWindowManager().getDefaultDisplay().getWidth());
for (int i = 0; i < tmpList.size(); i++) {
    Double tmpRatio = Double.valueOf(tmpList.get(i).height) / tmpList.get(i).width;
    if (Math.abs(aspectRatio - tmpRatio) < .15) {
        sizeList.add(tmpList.get(i));
    }
}
