Custom Video Enhancement in Java

I have to create a media player in Java that must play all common audio/video formats. Apart from the usual tasks like play/pause/stop/rewind, I need to apply enhancements to the video, such as histogram equalization. A few of the enhancements come from recent whitepapers, so I don't expect any API to support them out of the box; I will need to implement them myself. So I am looking for a good API that would let me implement these custom image manipulation techniques on a per-frame basis.
I have seen many Stack Overflow questions about Java video players and video manipulation, and they suggest staying away from JMF and using alternatives like vlcj, Xuggler, or GStreamer. But I am not sure whether any of them supports custom enhancement techniques. Also, the most recent question I could find on the topic was answered about two years ago. To play the video I also need some display component; I have not yet explored the JavaFX MediaView for this.
So, does anyone have experience with this: can we apply custom filters/enhancements to video played via JavaFX? And if not, what alternatives do I have for implementing this?
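For example, the kind of per-frame enhancement I mean would look roughly like the sketch below: plain histogram equalization on the luminance of a single frame, assuming the frame arrives as a BufferedImage (how to get the frames out of the player is exactly the part I am unsure about).

import java.awt.image.BufferedImage;

public final class HistogramEqualization {

    // Equalize the luminance histogram of one decoded frame (grayscale output for brevity).
    public static BufferedImage equalize(BufferedImage frame) {
        int w = frame.getWidth();
        int h = frame.getHeight();
        int[] hist = new int[256];

        // Build the luminance histogram
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                hist[luma(frame.getRGB(x, y))]++;
            }
        }

        // Cumulative distribution -> 256-entry lookup table
        int[] lut = new int[256];
        long cumulative = 0;
        long total = (long) w * h;
        for (int i = 0; i < 256; i++) {
            cumulative += hist[i];
            lut[i] = (int) (255 * cumulative / total);
        }

        // Remap every pixel through the lookup table
        BufferedImage out = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int v = lut[luma(frame.getRGB(x, y))];
                out.setRGB(x, y, (v << 16) | (v << 8) | v);
            }
        }
        return out;
    }

    // ITU-R BT.601 luma approximation
    private static int luma(int rgb) {
        int r = (rgb >> 16) & 0xFF;
        int g = (rgb >> 8) & 0xFF;
        int b = rgb & 0xFF;
        return (int) (0.299 * r + 0.587 * g + 0.114 * b);
    }
}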


Use Java or other languages in a Flutter Application

Since I got no answer and not much feedback on this question, Android Flutter Analyze Audio Waveform, and found nothing online about what I'm looking for, I'll simply ask a broader question. A comment on that answer told me to use native code and a platform channel to connect it to Flutter, but when I asked for clarification I got nothing.
So my question is whether I can do operations in Java (which has been around far longer and thus has much more documentation) and then use the outcome in Flutter.
More precisely, could I do these things in Java and Flutter:
1) Analyse the audio waveform and find peak points at specific frequencies, then use the timestamps to display them in Flutter;
Edit 1:
What are peak points?
This is the waveform of different frequency ranges (the orange one is bass, 80-255 Hz), and the points circled in black are peak points. I need to analyze the audio spectrum of a song and find the peak points at certain frequencies. When I find the peaks, I need to save their timestamps, for example 16 seconds in, and so on (see the sketch after this list for roughly what I mean).
2) Edit 2:
I need to edit some photos into a video, like a video collage, in which each frame of a 30 or 60 fps video is an image.
3) Edit 3:
I need to add basic frame-specific effects to the video, for example a blur that changes from frame to frame, or a glare.
4) Add music to that video and save it to MP4, AVI, or any other format.
5) Edit 4: Most importantly, I don't want to do all this in real time, but rather as an After Effects-like render process in which all the frames are rendered together. The only thing that would be nice is a progress bar telling the user that the render is at, for example, frame 200 of 300; but I don't want to display any of the frames or the video, just render it in the background and then save it to an MP4 video that can be viewed afterwards.
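To make point 1 concrete, here is a rough sketch of the peak detection I mean, assuming I already have the magnitude of one frequency band (e.g. the 80-255 Hz bass) per analysis frame; extracting that band in the first place is part of what I am asking about:

import java.util.ArrayList;
import java.util.List;

public class PeakFinder {

    // Return the timestamps (in seconds) of local maxima above a threshold.
    // bandMagnitude holds one value per analysis frame for a single frequency band.
    public static List<Double> findPeakTimestamps(double[] bandMagnitude,
                                                  double framesPerSecond,
                                                  double threshold) {
        List<Double> timestamps = new ArrayList<>();
        for (int i = 1; i < bandMagnitude.length - 1; i++) {
            boolean localMax = bandMagnitude[i] > bandMagnitude[i - 1]
                            && bandMagnitude[i] >= bandMagnitude[i + 1];
            if (localMax && bandMagnitude[i] >= threshold) {
                timestamps.add(i / framesPerSecond); // e.g. "16 seconds in"
            }
        }
        return timestamps;
    }
}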
As you can see, it's a difficult process to do in a language for which, due to its early state, you can hardly find a tutorial on how to play music. But UIs and some other things are way easier to do in Flutter, and Flutter is also multi-platform, so I'd prefer to stick with Flutter.
Edit 5:
I took a look at Qt and JUCE, and found that Qt seems a valid alternative, but from what I understood it is more of a "closed" system: for example, looking at the multimedia library, you can do basic things like play a video, but not collage frames and save the result (I don't know if I explained myself well). JUCE, on the other hand, looks better, but it seems aimed more at desktop audio VSTs than at mobile applications that include video rendering. Another thing is that these two are not free and open source like Flutter is.
Then there is Kivy, which may or may not be the best option: it is a Python port for mobile devices and I have a lot of experience with Python, which I think is one of the easier languages to learn, but on the other hand it doesn't have that much UI power, and as you mentioned there could be problems using libraries on Android.
You stated I could use C++ or Java with Flutter, but you said that with C++ it's a difficult process. So my question becomes: could I write the processing in Java, as in a normal Android application, and then somehow use those functions in a Flutter app?
Edit 6:
I found a possible alternative:
Kha (http://kha.tech/). But again I found nothing on how to use it with Flutter. Could it be a good idea?
I'm mostly asking for confirmation of whether I can use Java or any other language to do what I need in a Flutter application, and if so, how complicated it is (I'm a beginner, sort of). Some tutorials or links to kickstart the code would be helpful as well!
Flutter at this time is great for building UIs, but as you've mentioned, it doesn't have a lot of power or compatibility with libraries yet. A good part of the reason is that it doesn't have easy integration with C++, but I won't get into that now.
What you're asking is most likely possible, but it's not going to be simple at all. First, it sounds like you want to pull particular frames from a video and display them; that's going to be an additional complication. And don't forget that on a mobile device you have somewhat limited processing power; things will have to be very asynchronous, which can actually cause problems for Flutter unless you're careful.
As to your points:
1) This is a very general ask. I'd advise looking up Android audio processing libraries. I'm almost sure it's possible, but SO questions are not meant for asking advice on which framework to use; try https://softwarerecs.stackexchange.com/.
2) Once again, fairly general and a bit unclear about what you're asking; try softwarerecs. I assume you want to take several frames and make them into a video?
3) Some of those effects (e.g. zoom) you could definitely do with Flutter using a Transform, but that would only apply while playing in Flutter rather than being added to the video files themselves. To do that, you'll have to use the video library in Android/Java code.
4) Once again, the video library should do this.
5) This should also be part of the video library.
I do know of one audio/video library off the top of my head, Processing, which may do what you need, but I'm not sure; it does have an Android SDK. OpenCV would be another, but only for video/image processing, and I haven't used it directly from Java, so I'm not sure how easy it is to use.
For how you'd actually go about implementing this along with Flutter: you're going to need to use Platform Channels. I mentioned them in the comment on your other answer, but figured you could look them up yourself. The documentation does a much better job of explaining how they work and how to set them up than I can. But the TL;DR is that they allow you to send serialized data from native code (Java/Kotlin/Swift etc.) to Flutter code (Dart) and vice versa, and that data gets translated into similar data structures in the target language. You can set up various "channels" on which the data flows, and within those channels set up "methods" which get called at either end, or simply send events back and forth.
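As a rough sketch of the Java side (the channel and method names are placeholders, and this assumes the current Android v2 embedding), a method channel handler looks something like this:

import androidx.annotation.NonNull;
import io.flutter.embedding.android.FlutterActivity;
import io.flutter.embedding.engine.FlutterEngine;
import io.flutter.plugin.common.MethodChannel;

public class MainActivity extends FlutterActivity {
    private static final String CHANNEL = "com.example.app/render"; // placeholder name

    @Override
    public void configureFlutterEngine(@NonNull FlutterEngine flutterEngine) {
        super.configureFlutterEngine(flutterEngine);
        new MethodChannel(flutterEngine.getDartExecutor().getBinaryMessenger(), CHANNEL)
            .setMethodCallHandler((call, result) -> {
                if (call.method.equals("startRender")) {          // hypothetical method
                    String videoPath = call.argument("videoPath"); // hypothetical argument
                    // ... kick off the Java-side render job here ...
                    result.success("render started");
                } else {
                    result.notImplemented();
                }
            });
    }
}

The Dart side then creates a MethodChannel with the same name, invokes "startRender", and awaits the result; progress events can be sent back the same way.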
The complication I mentioned at the beginning is that sending images back and forth across the channels between native code and Dart isn't all that optimal. You most likely won't get a smooth 24/30/60 fps of images being sent from Java to Dart, and it might slow down the rest of the Flutter UI significantly. So for the actual viewport you'll want to use a Texture instead, which simply displays data from the Android side. You'll have to figure out how to write to a texture from Android yourself, but there's lots of information available on that. Controls, the visualization of the audio, etc. can be done directly in Flutter with data retrieved from native code.
What you'll have is essentially a remote control written in Dart/Flutter, which sends various commands to an audio/video processing library and wrapper code in Java.
If all that sounds complicated, that's because it is. And as much as Flutter is very nice to build UIs in, I have doubts as to whether it's going to be worth the extra complications if you're only targeting Android.
Not really related to the answer but rather some friendly advice:
There is one other thing I'll mention: I don't know your level of proficiency with programming and with different languages, but video/audio processing is generally not done in Java but rather in actual native code (i.e. C/C++). As such, there are two levels of abstraction you're going to have to deal with here (to some degree, as it will probably be abstracted somewhat or a lot depending on the library you're using): C/C++ to Java, and Java to Dart.
You may want to cut out the middlemen and work more directly with native code; in that case I'd recommend at least taking a look at Qt or JUCE, as they may be more suitable than Flutter for your particular use case. There's also Kivy (which uses Python), which may work well as there are a ton of image/video/audio processing libraries for Python, although they may not all work on Android and still involve a C++ to Python translation to some degree. You'll have to look into licensing, though: Qt has a broad enough open-source licence for most Android apps, but JUCE you'd have to pay for unless you're doing open source. I'd recommend Qt slightly over the others, as it has native decoding of video frames, although you'd probably want to incorporate OpenCV or something for the more complicated effects you're talking about. But it would probably be on the same level of complexity as simply writing Java code, with a slightly different UI style and easier integration with C++ libraries.

Video capture in JavaFX

This is actually a question about the JavaFX feature request process, and how to understand the information on JIRA. I find the jungle of forums, roadmaps and discussions completely unnavigable.
Specifically, the feature I'm interested in is video (e.g., webcam) capture:
JavaFX has come a long way, and I've recently found its real-time video capabilities to be working well. However, without video capture it cannot be described as a real rich-media development platform. The feature request on JIRA is 5 years old and there are many frustrated users commenting on this continued absence: https://bugs.openjdk.java.net/browse/JDK-8090438. If I understand it correctly, this page tells me that JavaFX 9 is when this feature will be introduced.
My question is: how do I know for sure when to expect video capture implementation in Java FX (if at all)?
I'm afraid that nobody can tell us 'for sure' when video support will be integrated, not even the folks at Oracle.
If you need video support now, I would recommend OpenIMAJ from http://www.openimaj.org. They support the software on Windows, Linux and ARM.
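A minimal capture-and-display sketch in the spirit of the OpenIMAJ tutorial (the resolution is arbitrary):

import org.openimaj.video.VideoDisplay;
import org.openimaj.video.capture.VideoCapture;
import org.openimaj.video.capture.VideoCaptureException;

public class WebcamSketch {
    public static void main(String[] args) throws VideoCaptureException {
        // Open the default capture device at 640x480
        VideoCapture capture = new VideoCapture(640, 480);
        // Show the live stream in a simple window; per-frame processing can be hooked in from there
        VideoDisplay.createVideoDisplay(capture);
    }
}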
My question is: how do I know for sure when to expect video capture implementation in Java FX (if at all)?
Simple answer: you don't. I wouldn't even expect it in 9 (they generally use that as a placeholder to mean some future release, rather than a guarantee it'll be in JFX 9).
The JavaFX media capabilities are much better than they once were, but still lacking; there certainly doesn't appear to be a dedicated media team working on JFX as there once was. If you really need something like this, I'd suggest using an external library (vlcj works well for me, and has much better format support for playback than JFX does).

What is the best approach to building a custom video player web browser plugin?

I need to stream multiple videos in a web browser and have them all be synchronized; I also need to be able to switch between audio channels on the fly. Despite a lot of research, it looks as though at present this is impossible using the current browser implementations of HTML5 video. Flash seems to have the same problem. There are forums full of people wanting to do on-the-fly, in-band audio and video switching with sync, and multi-video playback with sync, but no real way to do it.
It's in the HTML5 spec for the future, actually, but has no browser implementation yet.
So, I need to build a custom plugin or application.
What is the best approach to doing this? Would something like Google Web Toolkit be a good place to start? Is client-side Java a good approach to building something this custom while keeping deployment easy (versus, say, OS-specific C++ plugins)?
JavaFX 2 (an officially supported Java library) has a video player which I've seen modified in interesting ways in a 3D world, basically doing what you want. Currently it only supports FLV, but once it's out of beta they will probably change that, as they did in JavaFX 1, and support whatever codecs are installed on one's computer.
Edit: Checking over the video stuff again, I'm not totally sure about keeping things in sync, although this is on their roadmap (http://javafx.com/roadmap/):
Synchronized Media and Animations
Sometimes applications need to have very tightly aligned media and animation in a timeline. JavaFX 2.0 will provide support to tie a timeline to a specific media stream such that events in the timeline occur in sync with events in the media stream.
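For the basic playback part, a minimal JavaFX 2 MediaPlayer/MediaView sketch looks roughly like this (the URL is a placeholder; keeping several such players in sync is the open question above):

import javafx.application.Application;
import javafx.scene.Group;
import javafx.scene.Scene;
import javafx.scene.media.Media;
import javafx.scene.media.MediaPlayer;
import javafx.scene.media.MediaView;
import javafx.stage.Stage;

public class VideoSketch extends Application {
    @Override
    public void start(Stage stage) {
        // Placeholder URL; at this stage JavaFX 2 is limited to FLV content
        Media media = new Media("http://example.com/sample.flv");
        MediaPlayer player = new MediaPlayer(media);
        MediaView view = new MediaView(player);

        stage.setScene(new Scene(new Group(view), 640, 360));
        stage.show();
        player.play();
    }

    public static void main(String[] args) {
        launch(args);
    }
}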

audio editing web app

I want to develop, as part of my project, a web application that will enable users to edit, mix and apply effects to audio. I am aware of the J2EE development pattern, but am not sure where to start. I want Audacity-like functionality (actually, SoX-like). Is there any Java API for audio editing/mixing and applying effects? If so, can one write new effects? That would allow me to dynamically generate effect chains based on the users' input. I searched the web, but it's all "learn to use Audacity..." out there. Also, is there any way such effects can be applied in (near) real time?
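To illustrate the kind of per-sample effect I have in mind, here is a rough sketch that applies a simple gain while playing a file with plain javax.sound.sampled (the file name and gain are placeholders, and it assumes 16-bit signed little-endian PCM):

import java.io.File;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.SourceDataLine;

public class GainEffectSketch {
    public static void main(String[] args) throws Exception {
        double gain = 0.5; // placeholder effect parameter
        AudioInputStream in = AudioSystem.getAudioInputStream(new File("input.wav"));
        AudioFormat format = in.getFormat();
        SourceDataLine line = AudioSystem.getSourceDataLine(format);
        line.open(format);
        line.start();

        byte[] buffer = new byte[4096];
        int read;
        while ((read = in.read(buffer)) > 0) {
            // Apply the effect sample by sample before handing the data to the output line
            for (int i = 0; i + 1 < read; i += 2) {
                int sample = (buffer[i + 1] << 8) | (buffer[i] & 0xFF); // little-endian 16-bit
                sample = (int) (sample * gain);
                buffer[i] = (byte) (sample & 0xFF);
                buffer[i + 1] = (byte) ((sample >> 8) & 0xFF);
            }
            line.write(buffer, 0, read);
        }
        line.drain();
        line.close();
        in.close();
    }
}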
I found Soundation and Myna to be good apps that already do what I want (though Myna has no real-time effects support), but they've both got the same old audio editing UI.
Thanks in advance!
Maybe you can look at this: Movie Masher went open source after Adobe supported it, which allows integrating it into your site too. Maybe your solution is here: http://www.moviemasher.com/
You should check these options:
Web Audio API: http://www.w3.org/TR/webaudio/
WebPd: https://github.com/sebpiq/WebPd

Video playback in Java ( JMF, Fobs4JMF, Xuggler, FMJ )

I need simple video playback in Java.
Here are my requirements:
PRODUCTION QUALITY
Open and decode video files whose video and audio codecs can be chosen by me, i.e. I can pick well-behaved codecs.
Be able to play, pause, seekToFrame or seekToTime, and stop playback. Essentially I wish to be able to play segments of a single video file in a non-linear fashion. For example, I may want to play the segment 20.3 sec to 25.6 sec, pause for 10 seconds, and then play the segment 340.3 sec to 350.5 sec, etc.
During playback, video and audio must be in sync.
The video must be displayed in a Swing JComponent.
Must be usable in a commercial product without having to be open source (i.e. LGPL or commercial licensing is fine)
My research has led me to the following solutions:
Use Java Media Framework + Fobs4JMF
http://fobs.sourceforge.net/f4jmf_first.html
I have implemented a quick prototype and this seems to do what I need. I can play a segment of video using:
player.setStopTime(new Time(end));    // stop playback when the media time reaches 'end'
player.setMediaTime(new Time(start)); // seek to the segment's start time
player.start();                       // play from 'start' until 'end'
While Fobs4JMF seems to work, I feel the quality of the code is poor and the project is no longer active. Does anyone know of any products which use Fobs4JMF?
Write a Flash application which plays a video and use JFlashPlayer to bring it into my Java Swing application
Unlike Java, Flash is brilliant at playing video. I could write a small Flash application with the methods:
open(String videoFile),
play(),
pause(),
seek(int duration),
stop()
Then bring it into Java using JFlashPlayer which can call Flash functions from Java.
What I like about this solution is that video playback in Flash should be rock solid. Has anyone used JFlashPlayer to play video in Java?
Write a simple media player on top of Xuggler
Xuggler is an FFmpeg wrapper for Java which seems to be quite an active and high-quality project. However, implementing the simple video playback described in the requirements is not trivial (seeking in particular), though some of the work has been done in the MediaTools MediaViewer, which would be the base to build on.
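The decode-and-display loop that the MediaViewer revolves around looks roughly like the sketch below (the file path is a placeholder); the seeking and segment logic would have to be built on top of it:

import com.xuggle.mediatool.IMediaReader;
import com.xuggle.mediatool.IMediaViewer;
import com.xuggle.mediatool.ToolFactory;

public class XugglerPlaybackSketch {
    public static void main(String[] args) {
        // The reader decodes the container; the path is a placeholder
        IMediaReader reader = ToolFactory.makeReader("/path/to/video.mp4");
        // The bundled viewer renders decoded video and audio in a Swing window
        IMediaViewer viewer = ToolFactory.makeViewer();
        reader.addListener(viewer);
        // readPacket() decodes one packet at a time and feeds the listeners; null means no error yet
        while (reader.readPacket() == null) {
            // keep pumping until end of stream or an error is returned
        }
    }
}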
Use FMJ
I have tried to get FMJ to work but have had no success so far.
I would appreciate your opinions on my problem.
Can a brother get a shout out for Xuggler?
In my mind, vlcj is the way forward for this type of thing. I love Xuggler for encoding/transcoding work, but unfortunately it's just so complicated to do simple playback and solve all the sync issues and such, and it very much feels like reinventing the wheel to do so.
The only thing with vlcj is that to get it to work reliably with multiple players I've had to resort to out-of-process players. The framework wasn't the simplest thing in the world to get in place, but now that it's there it works beautifully. I'm currently running 3 out-of-process players in my app side by side with no problems whatsoever.
The other caveat is that the embedded media player won't work with a lightweight Swing component, just a heavyweight Canvas, but that hasn't proven a problem for me at all. If it does, you can use the direct media player to get a BufferedImage and display it on whatever you choose, but that will eat into your CPU a bit more (though no more than other players that take this approach).
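For reference, a minimal embedded-player sketch, assuming vlcj 3.x and a local VLC installation (the media path is a placeholder):

import javax.swing.JFrame;
import javax.swing.SwingUtilities;
import uk.co.caprica.vlcj.component.EmbeddedMediaPlayerComponent;
import uk.co.caprica.vlcj.discovery.NativeDiscovery;

public class VlcjPlayerSketch {
    public static void main(String[] args) {
        new NativeDiscovery().discover(); // locate the native VLC libraries
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("vlcj player");
            // The component wraps a heavyweight Canvas internally
            EmbeddedMediaPlayerComponent component = new EmbeddedMediaPlayerComponent();
            frame.setContentPane(component);
            frame.setSize(800, 600);
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setVisible(true);
            component.getMediaPlayer().playMedia("/path/to/video.mp4"); // placeholder path
        });
    }
}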
JavaFX has a number of working video and audio codecs built in. It's likely to be the solution with the broadest support at the moment.
I've been using jffmpeg in the same way you use FOBS; it works pretty well, although I haven't compared them.
I would also love to see an easy way to interface with native codecs the way that JavaFX does, but there doesn't seem to be real integration between JavaFX and Java.
There has also been some work on getting the VLC library libvlc into Java. I haven't tried it yet and would be interested to hear back from anyone who has.
I haven't tried Xuggler (which I'm interested in), but I'm having a good time with vlcj. The only drawback I find is that you have to have VLC installed before running your application.
I'd recommend using MPV. You can use it in combination with JavaFX quite easily, see this example.
In short, you use a little JNA magic to use the MPV native libraries directly, and then let the video display on a JavaFX stage. If you use a child stage, you can even overlay JavaFX controls on top of the video (with full transparency support).
VLC (with VLCJ) can be used in a similar fashion, but I find that the MPV solution performs better (faster seek and start times).
