I want to remove the background color of a video in ExoPlayer, or make it transparent: i.e. if the video's background color is red, then make it transparent so that the view behind the ExoPlayer is visible.
Note: I don't want to change the ExoPlayer's own background, only remove the background of the video that is playing in ExoPlayer.
I have searched a lot on Google but unfortunately didn't find the right answer. Please let me know whether this is possible, and if it is possible some other way, please point me to it.
Thanks in Advance
ExoPlayer is not really designed to modify a video file like this while displaying it.
Given that the background may change from frame to frame, I suspect you may find it hard to find any solution that will be able to do this quickly enough on any regular Android device, as there will likely be quite a bit of video processing involved.
If your background is static, like the room behind the speaker in a Zoom or similar conference call, then it may be a bit easier, and you could look at OpenCV background subtraction techniques:
https://opencv24-python-tutorials.readthedocs.io/en/latest/py_tutorials/py_video/py_bg_subtraction/py_bg_subtraction.html
Most of the examples will be in Python, so you will have to explore the Android OpenCV support, which is usually a subset of the main project and can be a little tricky to set up (check the Q&A on SO for this). At the time of writing, the Android OpenCV documentation also still uses Eclipse rather than Android Studio, which is something to be aware of.
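If you do go the OpenCV route, here is a minimal sketch of what background subtraction looks like from the Java bindings, just to show the shape of the API (this assumes OpenCV 3.x+ set up for Java; "input.mp4" is a placeholder):

import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.video.BackgroundSubtractorMOG2;
import org.opencv.video.Video;
import org.opencv.videoio.VideoCapture;

public class BackgroundSubtractionSketch {
    public static void main(String[] args) {
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME); // load the native OpenCV library

        VideoCapture capture = new VideoCapture("input.mp4"); // placeholder file
        BackgroundSubtractorMOG2 subtractor = Video.createBackgroundSubtractorMOG2();

        Mat frame = new Mat();
        Mat foregroundMask = new Mat();
        while (capture.read(frame)) {
            // The mask is ~255 where the model thinks a pixel is foreground and
            // ~0 where it thinks it is background; use it to cut out the subject.
            subtractor.apply(frame, foregroundMask);
        }
        capture.release();
    }
}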
If you have the luxury of removing the background on the server side before you stream the video to the Android device, then things will be easier and you should be able to find up-to-date examples using Python and the OpenCV techniques linked above.
If your use case is a 'greenscreen' background, then ffmpeg can also provide filters to change the background as you wish, including making it transparent. The documentation is here: https://ffmpeg.org/ffmpeg-filters.html#toc-chromakey
It includes an example that changes the green screen to transparent in an image (a PNG image in this example):
ffmpeg -i input.png -vf chromakey=green out.png
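For video rather than a single image, the output format also has to be able to store an alpha channel; a hedged example, assuming a VP9/WebM pipeline (file names are placeholders):

ffmpeg -i input.mp4 -vf chromakey=green -c:v libvpx-vp9 -pix_fmt yuva420p output.webm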
Lately I was working with MediaRecorder to capture videos and handle them in the output. However, as it turns out, there are security restrictions that don't allow me to intercept the output stream from the MediaRecorder (the problem is presented in the link below):
"Seekable" file descriptor to use with MediaRecorder Android 6.0 (API 23)
So I had to work out another solution and decided to use the Camera API and get the stream there. The first approach was to work with onPreviewFrame, capture the frames to a file and convert colors and formats (MediaCodec). Luckily, the problem with color conversion could be circumvented by getting the video from e.g. a SurfaceTexture, as described for example in bigflake's project:
https://bigflake.com/mediacodec/CameraToMpegTest.java.txt
I am not a total newbie in Android Java, but this is really overwhelming me. I don't want a ready-made recipe for this, and I am quite happy to sit and work the whole next week on cracking that code, but firstly my question is: how did you come to understand MediaCodec taking video from e.g. a SurfaceTexture and later putting it into a MediaMuxer? And secondly, could you recommend some tutorials that begin with the simplest project on this topic and then gradually expand the code?
I have really tried to work through bigflake's project, but I am at a loss, not least because the onCreate method is missing, and the hard part only begins once the video rendering starts.
Bigflake's MediaCodec page contains mostly tests for MediaCodec; if you still insist on using that as a reference, then start from encodeCameraToMpeg() in CameraToMpegTest, and also take a look at EncodeAndMux to get an idea of how to set up the MediaCodec encoder.
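For orientation, here is a minimal, hedged sketch of the encoder setup along the lines of EncodeAndMux (the resolution, bitrate and frame rate are just illustrative placeholders):

import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;
import java.io.IOException;

public class EncoderSetupSketch {
    private MediaCodec encoder;
    private Surface inputSurface; // render camera frames onto this (e.g. via OpenGL)

    public void prepareEncoder() throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
        // Surface input means you never handle raw YUV buffers yourself
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);

        encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        inputSurface = encoder.createInputSurface(); // must be called between configure() and start()
        encoder.start();
        // ... then drain the encoder's output buffers and feed them to a MediaMuxer ...
    }
}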
For a working video capture sample, take a look at taehwandev's MediaCodecExample. For an example on how to decode your recorded video, take a look at the BasicMediaDecode provided in the Google Samples repo.
The advantage of using MediaCodec along with the Camera1 API is that you'll be able to support devices from API level 18 upwards. If you're only targeting API levels 21 and upwards, then Camera2 should work; here's an Android Camera2Video sample for you to refer to if needed.
Finally, it might also be worthwhile to look at the new CameraX API. Although it shouldn't be used in production yet, that's the direction Android's camera API is moving in, so it's probably worth taking a look at the official documentation and going through a guide or two (e.g. Exploring CameraX) to get the basic idea ahead of time.
NOTE - Do not use the CameraX API in production code yet, as the CameraX library is in alpha stage and its API surfaces aren't yet finalized. I merely provided it as an option for you to keep tabs on for future reference.
Hi, I have been searching for a solution on how I can make something like this possible. I am able to make two separate views and make one grey and the other white, but it doesn't have the same feel. Google uses a kind of shadow effect on the white one. Is there a library that will allow me to do this?
You're probably talking about elevation; it's a Lollipop feature.
Material Design: Objects in 3D space - Elevation
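On Lollipop and above it's just a property on the view; a minimal sketch, assuming your white view has a hypothetical id R.id.card (ViewCompat silently no-ops on older devices):

View card = findViewById(R.id.card); // hypothetical view id
float elevationPx = 8f * getResources().getDisplayMetrics().density; // ~8dp in pixels
ViewCompat.setElevation(card, elevationPx); // casts the shadow for you on API 21+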
There are many ways to replicate it on older versions of Android; for example, you could make your own PNG background image with the shadow baked in.
Get creative with it, it's a lot of fun.
Hopefully not making this too vague... but I have been dealing with a lot of image scaling and manipulation in an app I'm working on and wanted to know:
Is it possible/feasible to warp images using Java code, and if so, how? I have read up on JAI but can't seem to grasp it very well. Is there any built-in implementation that would work with Android 2.3 or higher?
Any tutorials or examples that someone may have come across would be a great help as I have been researching for a while and can't seem to gain any ground.
End goal: to be able to warp an image (point to point, by pixels) in multiple places and then save the bitmap. This would be processed behind the scenes, and the user would be shown the end result.
Do you have an example of the kind of warping that you want to do? It's certainly feasible, but you'll probably end up doing pixel-by-pixel manipulation to generate the warped image.
There is a previous discussion here:
android image processing tutorial?
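For the point-to-point warping in the end goal, it's also worth knowing that Android has this built in as Canvas.drawBitmapMesh: you lay a grid of control points over the bitmap and move individual points. A minimal sketch, assuming source is the Bitmap you want to warp:

int meshWidth = 2, meshHeight = 2; // a 2x2 cell grid, i.e. 3x3 control points
float[] verts = new float[(meshWidth + 1) * (meshHeight + 1) * 2];

// Start with an undistorted grid laid over the source bitmap...
int i = 0;
for (int y = 0; y <= meshHeight; y++) {
    for (int x = 0; x <= meshWidth; x++) {
        verts[i++] = source.getWidth() * x / (float) meshWidth;
        verts[i++] = source.getHeight() * y / (float) meshHeight;
    }
}
// ...then move control points to warp; here the centre point (index 4 of the
// 9 grid points) is shifted 40 pixels to the right.
verts[2 * 4] += 40f;

Bitmap warped = Bitmap.createBitmap(source.getWidth(), source.getHeight(),
        Bitmap.Config.ARGB_8888);
new Canvas(warped).drawBitmapMesh(source, meshWidth, meshHeight, verts, 0, null, 0, null);
// "warped" can now be saved with Bitmap.compress(...)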
Is it possible to capture the entire screen from Android application code? I'm developing an application like VNC for the Android platform.
Regards
I think that depends on what you are trying to capture. I'm sure you can use Moss's method to create a screenshot from your own application - that is, something you render yourself.
As I understand it, however, capturing from other views, apps, etc. is designed to be impossible for security reasons. This is to prevent apps from taking screenshots of other apps, which would make it easy to steal sensitive data.
Yes, it is. You just need to create a Canvas backed by a Bitmap, draw to that canvas instead of the one you use in your onDraw method, and then save the bitmap, to the SD card for example.
Just to remind you, this method only works if you handle the drawing yourself, so to capture whatever you want you would need something like a custom home screen (just reuse the default Android home screen :D).
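A minimal sketch of that approach (the output location is just a placeholder):

// Back a Canvas with a Bitmap, let the view draw into it, then write it out.
public static void captureView(View view, File outFile) throws IOException {
    Bitmap bitmap = Bitmap.createBitmap(view.getWidth(), view.getHeight(),
            Bitmap.Config.ARGB_8888);
    Canvas canvas = new Canvas(bitmap);
    view.draw(canvas); // the view renders into our bitmap instead of the screen
    try (FileOutputStream out = new FileOutputStream(outFile)) {
        bitmap.compress(Bitmap.CompressFormat.PNG, 100, out);
    }
}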
I don't have personal experience with it, but this open source project sounds like it might either solve your problem, or provide you clues as to which API to use:
http://sourceforge.net/projects/ashot/
Screen capturing tool for Android handsets connected via USB to a desktop/laptop. It is great for fullscreen presentations, product demos, automatic screen recording, or just a single screenshot. Without root.
I need simple video playback in Java.
Here are my requirements:
PRODUCTION QUALITY
Open and decode video files whose video and audio codecs can be chosen by me, i.e. I can pick well-behaved codecs.
Be able to play, pause, seekToFrame OR seekToTime, and stop playback. Essentially I wish to be able to play segments of a single video file in a non-linear fashion. For example I may want to play the segment 20.3sec to 25.6sec, pause for 10 seconds and then play the segment 340.3sec to 350.5sec, etc.
During playback, video and audio must be in sync.
The video must be displayed in a Swing JComponent.
Must be usable in a commercial product without having to be open source (i.e. LGPL or a commercial licence is fine)
My research has led me to the following solutions:
Use Java Media Framework + Fobs4JMF
http://fobs.sourceforge.net/f4jmf_first.html
I have implemented a quick prototype and this seems to do what I need. I can play a segment of video using:
player.setStopTime(new Time(end));    // stop playback at the end of the segment
player.setMediaTime(new Time(start)); // seek to the start of the segment
player.start();                       // play from start until the stop time is reached
While Fobs4JMF seems to work, I feel the quality of the code is poor and the project is no longer active. Does anyone know of any products which use Fobs4JMF?
Write a Flash application which plays a video and use JFlashPlayer to bring it into my Java Swing application
Unlike Java, Flash is brilliant at playing video. I could write a small Flash application with the methods:
open(String videoFile),
play(),
pause(),
seek(int duration),
stop()
Then bring it into Java using JFlashPlayer which can call Flash functions from Java.
What I like about this solution is that video playback in Flash should be rock solid. Has anyone used JFlashPlayer to play video in Java?
Write a simple media player on top of Xuggler
Xuggler is an FFmpeg wrapper for Java which seems to be quite an active, high-quality project. However, implementing the simple video playback described in the requirements is not trivial (seeking in particular), though some of the work has been done in the MediaTools MediaViewer, which would be a base to build on.
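For what it's worth, the MediaTools layer does ship a ready-made viewer, so a plain open-and-play is short; a minimal sketch (the file name is a placeholder), though note it gives you none of the segment seeking from the requirements:

import com.xuggle.mediatool.IMediaReader;
import com.xuggle.mediatool.IMediaViewer;
import com.xuggle.mediatool.ToolFactory;

public class XugglerPlaySketch {
    public static void main(String[] args) {
        IMediaReader reader = ToolFactory.makeReader("video.mp4"); // placeholder
        // The viewer opens a window and plays decoded audio/video as packets arrive.
        reader.addListener(ToolFactory.makeViewer(IMediaViewer.Mode.AUDIO_VIDEO));
        while (reader.readPacket() == null) {
            // keep decoding until end of file or error
        }
    }
}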
Use FMJ
I have tried to get FMJ to work but have had no success so far.
I would appreciate your opinions on my problem.
Can a brother get a shout out for Xuggler?
In my mind, VLCJ is the way forward for this type of thing. I love Xuggler for encoding / transcoding work, but unfortunately it's just so complicated to do simple playback and solve all the sync issues and suchlike - and it does very much feel like reinventing the wheel doing so.
The only thing with VLCJ is that to get it to work reliably with multiple players I've had to resort to out of process players. The framework wasn't the simplest thing in the world to get in place, but when it's there it works beautifully. I'm currently running 3 out of process players in my app side by side with no problems whatsoever.
The other caveat is that the embedded media player won't work with a Swing component, just a heavyweight Canvas - but that hasn't proven a problem for me at all. If it does, then you can use the direct media player to get a BufferedImage and display it on whatever you choose, but it will eat into your CPU a bit more (though no more than other players that take this approach).
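For reference, a minimal sketch of the embedded player, assuming vlcj 4.x and a local VLC installation (the media path is a placeholder):

import javax.swing.JFrame;
import javax.swing.SwingUtilities;
import uk.co.caprica.vlcj.player.component.EmbeddedMediaPlayerComponent;

public class VlcjSketch {
    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("vlcj player");
            // The component hosts a heavyweight video surface, as noted above.
            EmbeddedMediaPlayerComponent player = new EmbeddedMediaPlayerComponent();
            frame.setContentPane(player);
            frame.setSize(800, 600);
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setVisible(true);

            player.mediaPlayer().media().play("video.mp4"); // placeholder path
            player.mediaPlayer().controls().setTime(20_300); // e.g. jump to 20.3s
        });
    }
}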
JavaFX has a number of working video and audio codecs built in. It's likely to be the solution with the broadest support at the moment.
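Since the requirement is display in a Swing JComponent, here is a minimal sketch of hosting JavaFX media playback in Swing via JFXPanel (the file name is a placeholder):

import java.io.File;
import javafx.application.Platform;
import javafx.embed.swing.JFXPanel;
import javafx.scene.Group;
import javafx.scene.Scene;
import javafx.scene.media.Media;
import javafx.scene.media.MediaPlayer;
import javafx.scene.media.MediaView;
import javax.swing.JFrame;

public class FxInSwingSketch {
    public static void main(String[] args) {
        JFrame frame = new JFrame("JavaFX video in Swing");
        JFXPanel fxPanel = new JFXPanel(); // creating this initializes the JavaFX runtime
        frame.add(fxPanel);
        frame.setSize(800, 600);
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.setVisible(true);

        Platform.runLater(() -> { // JavaFX objects must be built on the FX thread
            Media media = new Media(new File("video.mp4").toURI().toString());
            MediaPlayer player = new MediaPlayer(media);
            fxPanel.setScene(new Scene(new Group(new MediaView(player))));
            player.play(); // MediaPlayer also offers pause(), seek(Duration) and stop()
        });
    }
}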
I've been using jffmpeg in the same way you use FOBS, it works pretty well, although I haven't compared them.
I would also love to see an easy way to interface with native codecs the way that JavaFX does, but there doesn't seem to be real integration between JavaFX and Java.
There has also been some work trying to get the VLC library libvlc into java. I haven't tried it yet and would be interested to hear back from anyone who has.
I haven't tried Xuggler (which I'm interested in), but I'm having a good time with VLCJ. The only drawback I find is that you have to have VLC installed before your application.
I'd recommend using MPV. You can use it in combination with JavaFX quite easily, see this example.
In short, you use a little JNA magic to use the MPV native libraries directly, and then let the video display on a JavaFX stage. If you use a child stage, you can even overlay JavaFX controls on top of the video (with full transparency support).
VLC (with VLCJ) can be used in a similar fashion, but I find that the MPV solution performs better (faster seek and start times).
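To give an idea of the "JNA magic", here is a minimal, hand-rolled sketch that binds just enough of libmpv to start playback in mpv's own window. The interface below is my own assumed binding, not a published API, and the file name is a placeholder:

import com.sun.jna.Library;
import com.sun.jna.Native;
import com.sun.jna.Pointer;
import com.sun.jna.StringArray;

public class MpvSketch {
    public interface LibMpv extends Library {
        LibMpv INSTANCE = Native.load("mpv", LibMpv.class); // loads libmpv
        Pointer mpv_create();
        int mpv_initialize(Pointer handle);
        int mpv_command(Pointer handle, StringArray args);
    }

    public static void main(String[] args) throws InterruptedException {
        Pointer mpv = LibMpv.INSTANCE.mpv_create();
        LibMpv.INSTANCE.mpv_initialize(mpv);
        // libmpv commands are NULL-terminated string arrays; JNA's StringArray
        // appends the terminating NULL for us.
        LibMpv.INSTANCE.mpv_command(mpv, new StringArray(new String[] {"loadfile", "video.mp4"}));
        Thread.sleep(10_000); // crude: keep the JVM alive while mpv plays
    }
}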