I want to create a video from a sequence of images. The images are all the same size, and I don't want to add an audio track, so that part shouldn't be hard.
Can you explain what I need to do, which code I should use, or share sample code if you have any?
I suggest using ffmpeg: https://github.com/bramp/ffmpeg-cli-wrapper
This is a wrapper library for Java.
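A minimal sketch of building the video with that wrapper, assuming ffmpeg is installed at /usr/bin/ffmpeg and the frames are named frames/frame_0001.png, frame_0002.png, and so on; the builder calls follow the project's README, but check them against the version you pull in:

```java
import net.bramp.ffmpeg.FFmpeg;
import net.bramp.ffmpeg.FFmpegExecutor;
import net.bramp.ffmpeg.builder.FFmpegBuilder;

public class ImagesToVideo {
    public static void main(String[] args) throws Exception {
        // Path to a locally installed ffmpeg binary (adjust for your system).
        FFmpeg ffmpeg = new FFmpeg("/usr/bin/ffmpeg");

        // frame_0001.png, frame_0002.png, ... must all have the same dimensions.
        FFmpegBuilder builder = new FFmpegBuilder()
                .setInput("frames/frame_%04d.png")   // printf-style image sequence pattern
                .overrideOutputFiles(true)
                .addOutput("out.mp4")
                    .setVideoCodec("libx264")
                    .setVideoFrameRate(25, 1)        // 25 frames per second
                    .done();

        new FFmpegExecutor(ffmpeg).createJob(builder).run();
    }
}
```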
I am new to Android Studio and I'm doing my university project in it. Essentially, what I am trying to do is allow the user to import an mp3 file from their device and then generate a static waveform image from it.
I have searched around, but all I can seem to find are animated visualizers that illustrate the waveform while the mp3 is playing on the device. I just want a static image that the user can save to their phone. Please see the attached image.
[attached image: waveform]
If anyone can assist me with this or direct me towards some resources that I could use it would be greatly appreciated.
The best library you can use is FFmpeg; it's an application you run from the command line.
Here is some source code for the library: source 1, source 2
As for the waveform image, ffmpeg has facilities for extracting images from video/audio; you will have to look up the exact command and see how to do it.
I have never made something like that, so I don't know the exact command, but I do know that the best library for what you want to do is ffmpeg.
If you have any other questions about ffmpeg, feel free to ask me.
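If an ffmpeg binary is available (on Android that means bundling a prebuilt ffmpeg or using a wrapper that ships one), its showwavespic filter renders exactly this kind of static waveform picture. A rough sketch of invoking it from Java; the file names and the 640x240 size are just placeholders:

```java
public class WaveformImage {
    public static void main(String[] args) throws Exception {
        // Assumes an ffmpeg binary is on the PATH; on Android you would need a
        // prebuilt ffmpeg for the device or a wrapper library that bundles one.
        Process p = new ProcessBuilder(
                "ffmpeg", "-y",
                "-i", "input.mp3",
                "-filter_complex", "showwavespic=s=640x240", // render the full waveform as one picture
                "-frames:v", "1",                            // write a single image
                "waveform.png")
            .inheritIO()
            .start();
        System.out.println("ffmpeg exited with " + p.waitFor());
    }
}
```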
So, I was recommended GStreamer to create video files. I was going over their tutorial for creating a video file.
The problems I encountered are:
How do I create an AVI file rather than a raw YUV file?
What is the source being used there?
I want to supply a set of BufferedImages, or anything else that will show what was going on on the screen. I have previously used the JPEGtoMovie sample provided by the Java guys, and for that I had to first save all the images to disk as JPEGs, sort them into their correct order from lexicographical order, and a whole lot more.
I was planning to avoid that, which is why I was thinking of a Vector<BufferedImage> or a BlockingArrayQueue<BufferedImage>.
Which plug-ins do I need from GStreamer to create the AVI output?
Sorry, I have been asking too many questions today; I have never worked with a media framework before and I'm completely new to this.
The command gst-inspect will list all included elements (components).
You can produce an AVI file from the pipeline videotestsrc ! encoder ! avimux ! filesink, where encoder stands for the encoding element you'd like to use.
An alternative would be to use videotestsrc ! encodebin ! filesink; here you just build a profile, and encodebin will figure out which encoder and which muxer to use to create the format specified in the profile.
I did not understand the part about the BufferedImages. You can feed images manually to GStreamer (e.g. using [appsrc ! decodebin] instead of [videotestsrc]), but that's a last resort. There are also elements such as multifilesrc that read a sequence of images. Maybe you can give more details about what you want to do (where do the source frames come from?).
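For illustration, here is a rough sketch of driving such a pipeline from Java with the gst1-java-core bindings, using multifilesrc to read a numbered image sequence and mux MJPEG into an AVI. The element names are from GStreamer 1.x (the older gstreamer-java 0.10 API differs), and feeding BufferedImages directly would mean swapping multifilesrc for appsrc:

```java
import org.freedesktop.gstreamer.Bus;
import org.freedesktop.gstreamer.Gst;
import org.freedesktop.gstreamer.Pipeline;

public class ImagesToAvi {
    public static void main(String[] args) {
        Gst.init("ImagesToAvi", args);

        // multifilesrc reads a numbered image sequence from disk; jpegenc + avimux
        // produces an MJPEG-in-AVI file that most players handle.
        Pipeline pipeline = (Pipeline) Gst.parseLaunch(
                "multifilesrc location=frames/frame_%05d.png index=0 "
              + "caps=\"image/png,framerate=(fraction)25/1\" "
              + "! pngdec ! videoconvert ! jpegenc ! avimux "
              + "! filesink location=out.avi");

        pipeline.getBus().connect((Bus.EOS) source -> Gst.quit());
        pipeline.play();
        Gst.main();       // blocks until the EOS handler calls quit()
        pipeline.stop();
    }
}
```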
I want to split a video file into several fragments that can be played individually.
Is there any Java library that can be used for this?
Or, if Xuggle can be used, can anyone give me a simple example?
Thanks
A video file? There are quite a few video codecs and file formats. For a general solution, try ffmpeg; it's not a Java library, but it runs on all platforms.
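As a sketch of that approach, you can drive ffmpeg from Java with ProcessBuilder and cut fragments with stream copy (no re-encoding). The file names and fragment lengths below are placeholders, and with -c copy the cut points snap to keyframes, so fragments may not start exactly on the requested second:

```java
public class SplitVideo {

    // Cuts one fragment out of the input without re-encoding (stream copy).
    // Assumes an ffmpeg binary on the PATH; start and duration are in seconds.
    static void cutFragment(String input, String output, int startSec, int durationSec)
            throws Exception {
        new ProcessBuilder(
                "ffmpeg", "-y",
                "-ss", String.valueOf(startSec),
                "-i", input,
                "-t", String.valueOf(durationSec),
                "-c", "copy",          // copy the audio/video streams, no transcoding
                output)
            .inheritIO()
            .start()
            .waitFor();
    }

    public static void main(String[] args) throws Exception {
        // Example: split the first three minutes of movie.mp4 into 60-second pieces.
        for (int i = 0; i < 3; i++) {
            cutFragment("movie.mp4", "part" + i + ".mp4", i * 60, 60);
        }
    }
}
```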
I need to convert mp4/flv files into mp3 in my Android application, but I don't know C/C++ or the Android NDK. Do you know of any libraries or methods for doing this conversion easily in Java? Thanks in advance.
Your question is how to extract the audio from MP4/FLV files and save it as an mp3 file, right?
Then, I'm sorry, but the Android SDK does not provide any API for transcoding or track extraction.
Using the available media framework to achieve this is also not trivial (and even if you manage it, you will lose portability).
What I would suggest is to use an MP4 & FLV parser to extract the audio track, transcode it if it is not already mp3, and save the resulting data (if the extracted track is already mp3, just save the extracted data).
Or you can port the FFmpeg code base and use that, though this may be overkill for your small task.
If you just want to extract the mp3 track from an MP4, then study the native mp4 parser and use its APIs for extraction. You may have to replicate some code from Stagefright/OpenCORE.
Shash
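This answer predates them, but newer Android releases do ship MediaExtractor (API 16) and MediaMuxer (API 18), which cover the track-extraction part without the NDK. A minimal sketch, assuming an MP4 input with an AAC audio track; it copies the track into an .m4a container, and producing a real MP3 would still need a separate encoder (e.g. a LAME port or ffmpeg):

```java
import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import java.nio.ByteBuffer;

public class AudioTrackExtractor {

    // Copies the first audio track of an MP4 into a new .m4a file (no transcoding).
    public static void extractAudio(String inputPath, String outputPath) throws Exception {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(inputPath);

        // Find the first audio track.
        int audioTrack = -1;
        MediaFormat audioFormat = null;
        for (int i = 0; i < extractor.getTrackCount(); i++) {
            MediaFormat format = extractor.getTrackFormat(i);
            if (format.getString(MediaFormat.KEY_MIME).startsWith("audio/")) {
                audioTrack = i;
                audioFormat = format;
                break;
            }
        }
        if (audioTrack < 0) throw new IllegalArgumentException("no audio track found");
        extractor.selectTrack(audioTrack);

        // Write the compressed samples straight into a new MP4/M4A container.
        MediaMuxer muxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        int outTrack = muxer.addTrack(audioFormat);
        muxer.start();

        ByteBuffer buffer = ByteBuffer.allocate(256 * 1024);
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        while (true) {
            info.size = extractor.readSampleData(buffer, 0);
            if (info.size < 0) break;                      // end of stream
            info.presentationTimeUs = extractor.getSampleTime();
            info.flags = extractor.getSampleFlags();
            muxer.writeSampleData(outTrack, buffer, info);
            extractor.advance();
        }

        muxer.stop();
        muxer.release();
        extractor.release();
    }
}
```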
It's probably no longer relevant for you, but if someone still needs an mp4-to-mp3 converter, here's an API that can do the job.
I'm searching for a Java framework for manipulating audio and video files. I need functions like:
Split video and audio files
Get a frame from a video
Key Frame extraction
I tried Xuggle and I want to know if there are other frameworks.
Any advice?
I found Xuggler: http://www.xuggle.com/xuggler/
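As an example of the "get a frame from a video" requirement with Xuggler's mediatool API, here is a rough sketch that decodes packets and saves the first delivered frame as a PNG (the file names are placeholders):

```java
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

import com.xuggle.mediatool.IMediaReader;
import com.xuggle.mediatool.MediaListenerAdapter;
import com.xuggle.mediatool.ToolFactory;
import com.xuggle.mediatool.event.IVideoPictureEvent;

public class FirstFrameGrabber {
    public static void main(String[] args) {
        // Decode the video and hand each frame to the listener as a BufferedImage.
        IMediaReader reader = ToolFactory.makeReader("movie.mp4");
        reader.setBufferedImageTypeToGenerate(BufferedImage.TYPE_3BYTE_BGR);

        reader.addListener(new MediaListenerAdapter() {
            private boolean saved = false;

            @Override
            public void onVideoPicture(IVideoPictureEvent event) {
                if (saved) return;
                try {
                    ImageIO.write(event.getImage(), "png", new File("frame.png"));
                    saved = true;
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            }
        });

        // readPacket() returns null until an error or end of file; frames are
        // delivered to the listener above. A real tool would stop once the
        // desired frame has been written.
        while (reader.readPacket() == null) {
            // keep decoding
        }
    }
}
```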
Which audio and video formats are you dealing with? For MPEG-2, Project X may be a solution.