What library can I use to encode video in a Java Applet?

I would like to record the user's interaction with my Java Applet as a video to send (potentially stream) to my server, with the intention of uploading it to YouTube (or similar). A high frame rate is not required (a couple of frames per second is sufficient).
Minimizing the bandwidth used is preferred, so sending jpeg snapshots to the server and encoding server-side is my last resort.
Are there any lightweight Java video encoding libraries available that don't require native code?

I'm new to Java, so don't take this too seriously :)
I guess a good start with video encoding in Java is the Java Media Framework.
I haven't tried it, so I don't know what their support for FLV encoding is like.
Since Flash Media Server is commercial, couldn't you use Red5?
You would have a SWF, not an applet, but you would reach a broader percentage of viewers since Flash Player is pretty widespread.
And Alex has a good point: since you need to upload the video to YouTube, why not use their API?
hth

Xuggler can be used to encode pretty much any format from Java, but it requires a native component to be installed with it. There isn't an applet version available in the easy-to-use download, but some users have built custom versions of FFmpeg and Xuggler that they have used in downloadable applications. Try asking on the xuggler-users user group to see if others will help.

You can encode your images into H.264/MP4; that way the result is immediately suitable for web streaming. To upload in parallel with recording, you can break the sequence into small chunks, say 25-100 images each, and upload each chunk as a separate movie.
You can do it in pure Java without any native code, just use JCodec ( http://jcodec.org ). Here's a handy class that you can use:
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.util.ArrayList;

// Imports below assume JCodec 0.1.x; package names may differ in later releases
import org.jcodec.codecs.h264.H264Encoder;
import org.jcodec.codecs.h264.H264Utils;
import org.jcodec.common.NIOUtils;
import org.jcodec.common.SeekableByteChannel;
import org.jcodec.common.model.ColorSpace;
import org.jcodec.common.model.Picture;
import org.jcodec.containers.mp4.Brand;
import org.jcodec.containers.mp4.MP4Packet;
import org.jcodec.containers.mp4.TrackType;
import org.jcodec.containers.mp4.muxer.CompressedTrack;
import org.jcodec.containers.mp4.muxer.MP4Muxer;
import org.jcodec.scale.AWTUtil;
import org.jcodec.scale.RgbToYuv420;

public class SequenceEncoder {
    private SeekableByteChannel ch;
    private Picture toEncode;
    private RgbToYuv420 transform;
    private H264Encoder encoder;
    private ArrayList<ByteBuffer> spsList;
    private ArrayList<ByteBuffer> ppsList;
    private CompressedTrack outTrack;
    private ByteBuffer _out;
    private int frameNo;
    private MP4Muxer muxer;

    public SequenceEncoder(File out) throws IOException {
        this.ch = NIOUtils.writableFileChannel(out);

        // Transform to convert between RGB and YUV
        transform = new RgbToYuv420(0, 0);

        // Muxer that will store the encoded frames
        muxer = new MP4Muxer(ch, Brand.MP4);

        // Add video track to muxer
        outTrack = muxer.addTrackForCompressed(TrackType.VIDEO, 25);

        // Allocate a buffer big enough to hold output frames
        _out = ByteBuffer.allocate(1920 * 1080 * 6);

        // Create an instance of encoder
        encoder = new H264Encoder();

        // Encoder extra data ( SPS, PPS ) to be stored in a special place of MP4
        spsList = new ArrayList<ByteBuffer>();
        ppsList = new ArrayList<ByteBuffer>();
    }

    public void encodeImage(BufferedImage bi) throws IOException {
        if (toEncode == null) {
            toEncode = Picture.create(bi.getWidth(), bi.getHeight(), ColorSpace.YUV420);
        }

        // Perform conversion
        transform.transform(AWTUtil.fromBufferedImage(bi), toEncode);

        // Encode image into H.264 frame, the result is stored in '_out' buffer
        _out.clear();
        ByteBuffer result = encoder.encodeFrame(_out, toEncode);

        // Based on the frame above form correct MP4 packet
        spsList.clear();
        ppsList.clear();
        H264Utils.encodeMOVPacket(result, spsList, ppsList);

        // Add packet to video track
        outTrack.addFrame(new MP4Packet(result, frameNo, 25, 1, frameNo, true, null, frameNo, 0));
        frameNo++;
    }

    public void finish() throws IOException {
        // Push saved SPS/PPS to a special storage in MP4
        outTrack.addSampleEntry(H264Utils.createMOVSampleEntry(spsList, ppsList));

        // Write MP4 header and finalize recording
        muxer.writeHeader();
        NIOUtils.closeQuietly(ch);
    }
}
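For completeness, a minimal usage sketch following the chunking idea above; grabFrame() is a hypothetical stand-in for however you capture the applet's screen:
// Usage sketch: encode one 50-frame chunk; grabFrame() is hypothetical
SequenceEncoder encoder = new SequenceEncoder(new File("chunk-0001.mp4"));
for (int i = 0; i < 50; i++) {
    BufferedImage frame = grabFrame(); // e.g. paint the applet into a BufferedImage
    encoder.encodeImage(frame);
}
encoder.finish(); // finalize the MP4 so the chunk can be uploaded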

Why do you need to send the images or video directly? Sounds like a big bandwidth expense. Just serialize and send the stream of UI events with timestamps, and reconstruct what the user should be seeing on your server later (some visual details may depend on the user's machine/setup, but your applet isn't going to be able to capture those decently anyway).
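As a sketch of what that event stream could look like (the class and field names here are purely illustrative, not from any particular library):
// Hypothetical illustration of a timestamped UI-event record for server-side replay
import java.io.Serializable;

public class UiEvent implements Serializable {
    public final long timestampMillis;  // when the event happened
    public final String type;           // e.g. "CLICK", "KEY", "DRAG"
    public final int x, y;              // screen coordinates, if relevant
    public final String payload;        // key char, widget id, etc.

    public UiEvent(long timestampMillis, String type, int x, int y, String payload) {
        this.timestampMillis = timestampMillis;
        this.type = type;
        this.x = x;
        this.y = y;
        this.payload = payload;
    }
}
// The applet appends each UiEvent to an ObjectOutputStream over the wire;
// the server replays them against the same UI to re-render the session.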

Related

VLCJ dynamically modify frames while streaming

My project has two parts: the first is to stream a video, and the second is to capture the streamed video and analyze it. The capture-and-analyze side is already done.
While doing this, I made a workaround for my question: I convert the video into BufferedImages and save them to image files. Then I edit the image files and convert them back to video again. Finally I can stream the edited video and do the capture-and-analyze part.
But since this workaround is a long process, and in my real-life use the frame edits must be done simultaneously and dynamically, I need to interfere just before the frames are streamed.
I did a lot of searching, but maybe I'm not familiar enough with streaming and its terms, and I couldn't find a way to do it with vlcj and Java. Actually, to solve this problem I don't necessarily have to stick with Java and vlcj; if vlcj doesn't provide a solution, any other suggestions are welcome.
And here is the code I use to stream from a video file:
private void stream() throws InterruptedException {
    String media = "C:\\someVideo.mp4";
    String options = formatRtpStream("127.0.0.1", 5555);
    MediaPlayerFactory mediaPlayerFactory = new MediaPlayerFactory();
    EmbeddedMediaPlayer mediaPlayer = mediaPlayerFactory.mediaPlayers().newEmbeddedMediaPlayer();
    mediaPlayer.media().prepare(media,
            options,
            ":no-sout-rtp-sap",
            ":no-sout-standard-sap",
            ":sout-all",
            ":sout-keep"
    );
    mediaPlayer.controls().play(); // start playback, which drives the stream output
}

private static String formatRtpStream(String serverAddress, int serverPort) {
    StringBuilder sb = new StringBuilder(200);
    sb.append(":sout=");
    sb.append("#transcode{vcodec=h264,vb=1000,fps=25,scale=1,noaudio}");
    sb.append(":rtp{dst=");
    sb.append(serverAddress);
    sb.append(",port=");
    sb.append(serverPort);
    sb.append(",mux=ts}");
    return sb.toString();
}

Download part of a video file through a servlet

So I have a link to a video online (e.g. somewebsite.com/myVideo.mkv) and I want to download that video on the server through a servlet. The video file has CDN enabled, so basically any public user can just put the link into the browser and it will start playing. This is the code I have so far.
void downloadFile(URL myURL) throws IOException {
    InputStream input = myURL.openStream();
    File video = new File("/path-to-file/" + myURL.getFile());
    FileOutputStream output = new FileOutputStream(video);
    byte[] buffer = new byte[1024];
    int read;

    // Write full range.
    while ((read = input.read(buffer)) > 0) {
        output.write(buffer, 0, read);
    }
    output.close();
    input.close();
}
If I do that, it downloads the entire video file from the URL and the video plays back fine. However, if I want to specify a byte range, downloadFile(URL myURL, long startByte, long endByte), the video doesn't play back. I used input.skip() to skip forward to startByte, but I suspect that skips over some important header of the mkv format, which is why the player can't recognize the file. Does anyone know how to do this in Java?
There are three dominant HTTP streaming technologies: Apple HTTP Live Streaming, Microsoft Smooth Streaming, and Adobe HTTP Dynamic Streaming. Each of these technologies provides tools to convert video to the corresponding format. If you start with one large video file, the Apple and Adobe tools will create a number of small files containing, say, 10 seconds of video each, plus a playlist file that gives the client a clue how to read them. I believe the Microsoft tools can actually generate a single file, but it contains small video fragments internally.
With HTTP streaming, the "intelligence" lives in the client, which knows how to read the master playlist file and how to get around either numerous media files or numerous media file fragments. The HTTP server only has to serve a file or a file fragment specified by the Range header.
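A minimal sketch of the server side of that last point, assuming a plain HttpServlet and a simple "bytes=start-end" Range header (the file path is illustrative; production code needs more validation):
// Sketch: serving a single byte range from a servlet
import java.io.*;
import javax.servlet.http.*;

public class RangeServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        File f = new File("/path-to-file/video.mkv"); // illustrative path
        long start = 0, end = f.length() - 1;

        String range = req.getHeader("Range");        // e.g. "bytes=1000-2000"
        if (range != null && range.startsWith("bytes=")) {
            String[] parts = range.substring(6).split("-");
            start = Long.parseLong(parts[0]);
            if (parts.length > 1 && !parts[1].isEmpty()) end = Long.parseLong(parts[1]);
            resp.setStatus(HttpServletResponse.SC_PARTIAL_CONTENT); // 206
            resp.setHeader("Content-Range", "bytes " + start + "-" + end + "/" + f.length());
        }
        resp.setHeader("Accept-Ranges", "bytes");
        resp.setContentLengthLong(end - start + 1);

        try (RandomAccessFile raf = new RandomAccessFile(f, "r");
             OutputStream out = resp.getOutputStream()) {
            raf.seek(start);                          // jump to the requested offset
            byte[] buf = new byte[8192];
            long remaining = end - start + 1;
            int n;
            while (remaining > 0 && (n = raf.read(buf, 0, (int) Math.min(buf.length, remaining))) > 0) {
                out.write(buf, 0, n);
                remaining -= n;
            }
        }
    }
}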

How to read raw values of 3gp / AMR-NB audio format?

In my Android application I am recording the user's voice which I save as a .3gp encoded audio file.
What I want to do is open it up, i.e. obtain the sequence x[n] of audio samples, in order to perform some audio signal analysis.
Does anyone know how I could go about doing this?
You can use the Android MediaCodec class to decode 3gp or other media files. The decoder output is a standard PCM byte array. You can send this output directly to the Android AudioTrack class for playback, or continue with the byte array for further processing such as DSP. To apply a DSP algorithm, the byte array must be transformed into a float/double array. There are several steps to get the byte array output. In summary it looks as follows:
Instantiate MediaCodec
String mMime = "audio/3gpp";
MediaCodec mMediaCodec = MediaCodec.createDecoderByType(mMime);
Create the media format and configure the media codec. The sample rate and channel count must come from the actual file, e.g. via MediaExtractor, rather than from an empty MediaFormat:
MediaExtractor extractor = new MediaExtractor();
extractor.setDataSource(inputFilePath); // path to the .3gp file
MediaFormat mMediaFormat = extractor.getTrackFormat(0);
extractor.selectTrack(0);
mMediaCodec.configure(mMediaFormat, null, null, 0);
mMediaCodec.start();
Capture output from MediaCodec (should be processed inside a thread)
ByteBuffer[] mOutputBuffers = mMediaCodec.getOutputBuffers();
MediaCodec.BufferInfo buf_info = new MediaCodec.BufferInfo();
int outputBufferIndex = mMediaCodec.dequeueOutputBuffer(buf_info, 0);
if (outputBufferIndex >= 0) {
    byte[] pcm = new byte[buf_info.size];
    mOutputBuffers[outputBufferIndex].get(pcm, 0, buf_info.size);
    mMediaCodec.releaseOutputBuffer(outputBufferIndex, false);
}
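The summary above omits feeding the compressed input to the decoder. A minimal sketch of the full feed/drain loop, assuming the extractor and codec set up as above (API level 16-20 style, using the now-deprecated buffer arrays):
// Sketch of a complete decode loop; real code should also handle
// INFO_OUTPUT_BUFFERS_CHANGED and INFO_OUTPUT_FORMAT_CHANGED results.
ByteBuffer[] inputBuffers = mMediaCodec.getInputBuffers();
ByteBuffer[] outputBuffers = mMediaCodec.getOutputBuffers();
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
boolean inputDone = false;

while (true) {
    if (!inputDone) {
        int inIndex = mMediaCodec.dequeueInputBuffer(10000);
        if (inIndex >= 0) {
            int size = extractor.readSampleData(inputBuffers[inIndex], 0);
            if (size < 0) {
                // Signal end of stream to the decoder
                mMediaCodec.queueInputBuffer(inIndex, 0, 0, 0,
                        MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                inputDone = true;
            } else {
                mMediaCodec.queueInputBuffer(inIndex, 0, size,
                        extractor.getSampleTime(), 0);
                extractor.advance();
            }
        }
    }
    int outIndex = mMediaCodec.dequeueOutputBuffer(info, 10000);
    if (outIndex >= 0) {
        byte[] pcm = new byte[info.size];
        outputBuffers[outIndex].get(pcm, 0, info.size); // raw PCM samples
        outputBuffers[outIndex].clear();
        mMediaCodec.releaseOutputBuffer(outIndex, false);
        // ... hand 'pcm' to AudioTrack or your DSP code here ...
        if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) break;
    }
}
mMediaCodec.stop();
mMediaCodec.release();
extractor.release();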
This Google IO talk might be relevant here.

Looking for pure Java API to read metadata from an MP4 file [duplicate]

Mp3 files can be handled using this mp3 SPI support, but I'm not finding anything similar for mp4 files.
Any help would be appreciated.
--UPDATE
What I want to do is get the file's duration, as I do with wave files using this code:
AudioInputStream audioInputStream = AudioSystem.getAudioInputStream(file);
AudioFormat format = audioInputStream.getFormat();
long audioFileLength = file.length();
int frameSize = format.getFrameSize();
float frameRate = format.getFrameRate();
float durationInSeconds = (audioFileLength / (frameSize * frameRate));
--ANSWER
Here is the answer code using the hint of #mdma (IBM toolkit):
/**
 * Use IBMPlayerForMpeg4SDK to get the mp4 file duration.
 *
 * @return the mp4File duration in milliseconds.
 */
public static long getMp4Duration(File mp4File) throws IllegalStateException, IOException {
    PlayerControl playerControl = PlayerFactory.createLightweightMPEG4Player();
    playerControl.open(mp4File.getAbsolutePath());
    long millis = playerControl.getDuration();
    // int sec = (int) ((millis / 1000) % 60);
    // int min = (int) ((millis / 1000) / 60);
    // System.out.println("IBM Toolkit result = " + min + ":" + sec);
    return millis;
}
--
Related, language independent, question:
Anyone familiar with mp4 data structure?
Mp4 is a container format - to be able to find the duration of the audio inside, you have to first parse the content out of the container. You can extract the content of an mp4 file using isobox mp4parser.
Once you've done that, you then have the raw audio data. If it's one of the supported formats in java (wav, mp3, etc..) then you can just open this file like you have done for wavs already. Initially you will probably extract the audio to a separate file, for simplicity's sake and easier debugging. When this is working, you can then do the extraction inline - you implement an InputStreamFilter that extracts the audio content from the mp4 on the fly, so no additional external files are required.
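If all you need is the duration, a small sketch using the mp4parser isoparser API might be enough (class names are from the classic com.coremedia.iso packages; verify against the version you use):
// Sketch: reading MP4 duration via mp4parser's IsoFile
import com.coremedia.iso.IsoFile;
import com.coremedia.iso.boxes.MovieHeaderBox;

public class Mp4DurationExample {
    public static double durationSeconds(String path) throws java.io.IOException {
        IsoFile isoFile = new IsoFile(path);
        MovieHeaderBox mvhd = isoFile.getMovieBox().getMovieHeaderBox();
        // Duration is expressed in timescale units (ticks per second)
        double seconds = (double) mvhd.getDuration() / mvhd.getTimescale();
        isoFile.close();
        return seconds;
    }
}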
IBM Alphaworks have a pure java MP4 decoder library available, but it's possibly overkill for your present needs.
Xuggler ( http://www.xuggle.com/xuggler/ ) provides about the best Java wrapper for FFMPEG that I've seen - it'll let you decode the images out of almost any file, and then do whatever you like with them.
You don't specify what you want to do. If you just want to play the files, you can use MPlayer and control it remotely via the ProcessBuilder API and stdio.
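For instance, a minimal sketch of driving MPlayer in slave mode from Java (assumes the mplayer binary is on the PATH; the video path is illustrative):
// Sketch: launching MPlayer in slave mode and sending commands over stdin
import java.io.OutputStreamWriter;
import java.io.Writer;

public class MPlayerExample {
    public static void main(String[] args) throws Exception {
        Process p = new ProcessBuilder(
                "mplayer", "-slave", "-quiet", "/path/to/video.mp4")
                .redirectErrorStream(true)
                .start();
        Writer cmd = new OutputStreamWriter(p.getOutputStream());
        cmd.write("pause\n");   // slave-mode command: toggle pause
        cmd.flush();
        // ... later ...
        cmd.write("quit\n");    // slave-mode command: exit mplayer
        cmd.flush();
        p.waitFor();
    }
}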

Xuggler/Java: How to create packets from a byte array?

I am building an export application using Xuggler that exports an H.264-encoded recording so that it can be played in an external player (writing the video recording to an .avi or .mp4 container).
I would like to know how to create an IPacket from a byte array representing a video frame. Which parameters of the IPacket need to be set, and what values should they contain?
And likewise, which parameters should be set, and to what values, for the container that gathers the packets?
// Wrap the raw frame bytes in an IBuffer and build a packet from it
packet = IPacket.make(IBuffer.make(null, data, 0, data.length));
packet.setTimeStamp(time);                   // presentation time in time-base units
packet.setTimeBase(IRational.make(1, 1000)); // 1/1000 = millisecond time base
int pksz = packet.getSize();
packet.setComplete(true, pksz);              // mark the packet as complete
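On the container side, a hedged sketch of how those packets might be written with Xuggler's IContainer (method names from the com.xuggle.xuggler API; check them against your Xuggler version, and note that pre-encoded H.264 may also need codec extradata set on the coder):
// Sketch: writing pre-encoded H.264 packets into an MP4 container with Xuggler
IContainer container = IContainer.make();
container.open("out.mp4", IContainer.Type.WRITE, null);

IStream stream = container.addNewStream(0);
IStreamCoder coder = stream.getStreamCoder();
coder.setCodec(ICodec.ID.CODEC_ID_H264);
coder.setTimeBase(IRational.make(1, 1000));  // must match the packet time base
coder.setWidth(640);                         // illustrative frame size
coder.setHeight(480);
coder.open(null, null);

container.writeHeader();
// for each frame: build 'packet' as in the snippet above, then:
// packet.setStreamIndex(stream.getIndex());
// container.writePacket(packet);
container.writeTrailer();
container.close();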
