I have a link to a video online (e.g. somewebsite.com/myVideo.mkv) and I want to download that video on the server through a servlet. The video file is served through a CDN, so basically any public user can put the link into a browser and it will start playing. This is the code I have so far:
void downloadFile(URL myURL) throws IOException {
    InputStream input = myURL.openStream();
    File video = new File("/path-to-file/" + myURL.getFile());
    FileOutputStream output = new FileOutputStream(video);
    byte[] buffer = new byte[1024];
    int read;
    // Write the full range.
    while ((read = input.read(buffer)) > 0) {
        output.write(buffer, 0, read);
    }
    output.close();
    input.close();
}
If I do that, it downloads the entire video file from the URL and the video plays back fine. However, if I specify a byte range, downloadFile(URL myURL, long startByte, long endByte), the video doesn't play back. I used input.skip() to skip forward to startByte, but I suspect that skips over an important header of the MKV format, which is why the player can't recognize the file. Does anyone know how to do this in Java?
There are three dominant HTTP streaming technologies: Apple HTTP Live Streaming, Microsoft Smooth Streaming, and Adobe HTTP Dynamic Streaming. Each of these technologies provides tools to convert video to the corresponding format. If you start with one large video file, the Apple and Adobe tools would create a number of small files containing, say, 10 seconds of video each, plus a playlist file that gives the client a clue how to read them. I believe the Microsoft tools can actually generate a single file, but it would contain small video fragments internally.
With HTTP streaming, the "intelligence" lives in the client, which knows how to read the master playlist file and how to work through either the numerous media files or the numerous media file fragments. The HTTP server only has to serve a file or a file fragment specified by the Range header.
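For the asker's simpler goal of fetching a byte range in Java, the HTTP-level approach is to send a Range header rather than calling input.skip(); the server (or CDN) then does the slicing. A minimal sketch, assuming the CDN honours Range requests; note that a mid-file range still lacks the MKV header, so this fetches the right bytes but does not by itself produce a playable file:

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class RangeDownload {
    // Builds the value of an HTTP Range header, e.g. "bytes=0-499".
    static String rangeHeader(long startByte, long endByte) {
        return "bytes=" + startByte + "-" + endByte;
    }

    // Sketch: ask the server for just [startByte, endByte] instead of skipping locally.
    static void downloadRange(URL url, long startByte, long endByte, File dest) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestProperty("Range", rangeHeader(startByte, endByte));
        // 206 Partial Content means the server honoured the range.
        if (conn.getResponseCode() != HttpURLConnection.HTTP_PARTIAL) {
            throw new IOException("Server ignored the Range header");
        }
        try (InputStream in = conn.getInputStream();
             OutputStream out = new FileOutputStream(dest)) {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) > 0) {
                out.write(buffer, 0, read);
            }
        }
    }
}
```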
We are recording some audio on Android phones, which has the following format according to VLC:
Codec: MPEG AAC Audio (mp4a)
Language: English
Type: Audio
Channels: Stereo
Sample rate: 32000 Hz
Bits per sample: 32
The above files were not playing in the Safari browser with MIME type audio/mpeg, but as soon as we changed the MIME type to audio/mp4 they started playing in Safari.
For Android, we stream this file through an API resource as follows:
@GET
@UnitOfWork
@Produces("audio/mpeg")
@Path("/getaudiofile/{fileId}")
public Response getPart(@Auth AuthUser authUser, @PathParam("fileId") Long fileId) {
    File audioFile = new File(filesTableDAO.getFilePathById(fileId));
    if (audioFile.exists()) {
        return Response.ok().entity(audioFile).build();
    } else {
        // ... Return 404 here
    }
}
But with the above API some files get played and some do not, similar to the earlier Safari case. The Safari problem went away as soon as we changed the MIME type; both "audio/mp4" and "video/mp4" work there.
But for the API /getaudiofile/{fileId}, none of the following MIME types worked with the javax.ws.rs @Produces annotation:
audio/mp4
video/mp4
audio/mpeg
audio/m4a
audio/mpeg-4
audio/mp4a
But with audio/mpeg, some of the files play and some don't.
What would be the right MIME type, or can the file carry the codec/MIME info itself when returned from the API?
Or is there any way to make the Android MediaPlayer MIME-type aware, like an HTML tag?
We have streamed file content with MIME "audio/mpeg"; the media player plays smaller files easily, but bigger streamed content, e.g. 10-20 MB, fails to play.
The problem with a stream URL is that the player does not get any extension hint, e.g. mp3 or mp4. Hence we want the media player to know in advance what type of content is being streamed.
Does the MediaPlayer API support setting a MIME type on the player instance prior to playing?
Update
1] This is happening for large files; small files play very well. A recording of a few seconds (1-50 secs) plays without a problem, while a recording with a playback length of more than a minute fails to play.
2] When played from an authenticated URL, playback fails; the same file played from the local file system plays flawlessly.
If I understand your question correctly, perhaps you might try
openTypedAssetFile
public AssetFileDescriptor openTypedAssetFile (Uri uri,
String mimeTypeFilter,
Bundle opts)
Called by a client to open a read-only stream containing data of a particular MIME type. This is like
openAssetFile(Uri, String), except the file can only be read-only and
the content provider may perform data conversions to generate data of
the desired type.
Documentation for openTypedAssetFile is here.
Also look at the API for openTypedAssetFileDescriptor here.
https://github.com/aruld/jersey-streaming
The above project by Arul Dhesiaseelan solved my issue. The way I wrote the get-file resource was not efficient; in the above GitHub project, take a look at MediaResource.java.
This is the proper way of streaming media files; adjust the MIME types per your file.
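The key idea in that approach is honouring the client's Range header and answering with 206 Partial Content. A minimal sketch of the range-parsing step; parseRange is an illustrative helper for this answer, not the project's actual API, and real code needs more validation:

```java
public class RangeUtil {
    // Parses a header like "bytes=100-" or "bytes=100-499" against a known file length.
    // Returns {start, endInclusive}.
    static long[] parseRange(String rangeHeader, long fileLength) {
        String spec = rangeHeader.substring("bytes=".length());
        int dash = spec.indexOf('-');
        long start = Long.parseLong(spec.substring(0, dash));
        String endPart = spec.substring(dash + 1);
        // An open-ended range ("bytes=100-") runs to the end of the file.
        long end = endPart.isEmpty() ? fileLength - 1 : Long.parseLong(endPart);
        return new long[] { start, Math.min(end, fileLength - 1) };
    }
}
```

The resource then responds with status 206, a Content-Range: bytes start-end/length header, and only that slice of the file.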
I set headers on the media player as follows:
Map<String, String> headers = new HashMap<>();
headers.put("Authorization", "Basic <BASE64_ENCODED_AUTH_TOKEN>");
headers.put("Accept-Ranges", "bytes");
headers.put("Status", "206");
headers.put("Cache-control", "no-cache");
Uri uri = Uri.parse(url);
mediaPlayer.setDataSource(getContext(), uri, headers);
Now I am able to stream media with authentication.
I'm simply trying to convert a .mov file into .webm using Xuggler, which should work since FFmpeg supports .webm files.
This is my code:
IMediaReader reader = ToolFactory.makeReader("/home/user/vids/2.mov");
reader.addListener(ToolFactory.makeWriter("/home/user/vids/2.webm", reader));
while (reader.readPacket() == null);
System.out.println( "Finished" );
On running this, I get this error:
[main] ERROR org.ffmpeg - [libvorbis # 0x8d7fafe0] Specified sample_fmt is not supported.
[main] WARN com.xuggle.xuggler - Error: could not open codec (../../../../../../../csrc/com/xuggle/xuggler/StreamCoder.cpp:831)
Exception in thread "main" java.lang.RuntimeException: could not open stream com.xuggle.xuggler.IStream#-1921013728[index:1;id:0;streamcoder:com.xuggle.xuggler.IStreamCoder#-1921010088[codec=com.xuggle.xuggler.ICodec#-1921010232[type=CODEC_TYPE_AUDIO;id=CODEC_ID_VORBIS;name=libvorbis;];time base=1/44100;frame rate=0/0;sample rate=44100;channels=1;];framerate:0/0;timebase:1/90000;direction:OUTBOUND;]: Operation not permitted
at com.xuggle.mediatool.MediaWriter.openStream(MediaWriter.java:1192)
at com.xuggle.mediatool.MediaWriter.getStream(MediaWriter.java:1052)
at com.xuggle.mediatool.MediaWriter.encodeAudio(MediaWriter.java:830)
at com.xuggle.mediatool.MediaWriter.onAudioSamples(MediaWriter.java:1441)
at com.xuggle.mediatool.AMediaToolMixin.onAudioSamples(AMediaToolMixin.java:89)
at com.xuggle.mediatool.MediaReader.dispatchAudioSamples(MediaReader.java:628)
at com.xuggle.mediatool.MediaReader.decodeAudio(MediaReader.java:555)
at com.xuggle.mediatool.MediaReader.readPacket(MediaReader.java:469)
at com.mycompany.xugglertest.App.main(App.java:13)
Java Result: 1
Any ideas?
There's a funky thing going on with Xuggler where it doesn't always let you set the sample format of IAudioSamples. You'll need to use an IAudioResampler.
Took me a while to figure this out. This post by Marty helped a lot, though his code is outdated now.
Here's how you fix it.
Before encoding
I'm assuming here that audio input has been properly set up, resulting in an IStreamCoder called audioCoder.
After that's done, you are probably creating an IMediaWriter and adding an audio stream like so:
final IMediaWriter oggWriter = ToolFactory.makeWriter(oggOutputFile);
// Using stream 1 'cause there is also a video stream.
// For an audio only file you should use stream 0.
oggWriter.addAudioStream(1, 1, ICodec.ID.CODEC_ID_VORBIS,
audioCoder.getChannels(), audioCoder.getSampleRate());
Now create an IAudioResampler:
IAudioResampler oggResampler = IAudioResampler.make(audioCoder.getChannels(),
audioCoder.getChannels(),
audioCoder.getSampleRate(),
audioCoder.getSampleRate(),
IAudioSamples.Format.FMT_FLT,
audioCoder.getSampleFormat());
And tell your IMediaWriter to update to its sample format:
// The stream 1 here is consistent with the stream we added earlier.
oggWriter.getContainer().getStream(1).getStreamCoder().
setSampleFormat(IAudioSamples.Format.FMT_FLT);
During encoding
You are currently probably creating an IAudioSamples and filling it with audio data, like so:
IAudioSamples audioSample = IAudioSamples.make(512, audioCoder.getChannels(),
audioCoder.getSampleFormat());
int bytesDecoded = audioCoder.decodeAudio(audioSample, packet, offset);
Now create an IAudioSamples for the resampled data:
IAudioSamples vorbisSample = IAudioSamples.make(512, audioCoder.getChannels(),
IAudioSamples.Format.FMT_FLT);
Finally, resample the audio data and write the result:
oggResampler.resample(vorbisSample, audioSample, 0);
oggWriter.encodeAudio(1, vorbisSample);
Final thought
Just a hint to get your output files to play well:
If you use audio and video within the same container, then audio and video data packets should be written in such an order that the timestamp of each data packet is higher than that of the previous data packet. So you are almost certainly going to need some kind of buffering mechanism that alternates writing audio and video.
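That interleaving requirement can be sketched as a small merge step that always writes whichever pending packet has the lower timestamp. The PendingPacket type here is an illustrative stand-in, not a Xuggler class:

```java
import java.util.ArrayList;
import java.util.Deque;
import java.util.List;

public class InterleaveSketch {
    // Illustrative stand-in for an encoded packet with a timestamp.
    static class PendingPacket {
        final long timestamp;
        final boolean isAudio;
        PendingPacket(long timestamp, boolean isAudio) {
            this.timestamp = timestamp;
            this.isAudio = isAudio;
        }
    }

    // Merges two timestamp-ordered queues into one non-decreasing write order,
    // which is what the container muxer expects.
    static List<PendingPacket> interleave(Deque<PendingPacket> audio, Deque<PendingPacket> video) {
        List<PendingPacket> writeOrder = new ArrayList<>();
        while (!audio.isEmpty() || !video.isEmpty()) {
            if (video.isEmpty() ||
                (!audio.isEmpty() && audio.peek().timestamp <= video.peek().timestamp)) {
                writeOrder.add(audio.poll());
            } else {
                writeOrder.add(video.poll());
            }
        }
        return writeOrder;
    }
}
```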
I would like to know if it is possible to receive an image via the POST method with an HTTP server implemented in Java (with a simple file-input form). I have already implemented the Java server, but I can only receive text files via POST, because my application simply copies the file content to another empty file, creating the same file with the same characteristics. This works with text files, but not with images or other binary files.
Anyone know how to implement it with images?
Some coordinates would be of great help!
Thanks in advance!
As far as I know you should do something like this:
Server side: if you use a servlet that receives data via POST, you have to get the OutputStream from the response. Once you have it, you are done, because you write the image data onto that stream.
For example, let's suppose your image is a file stored on the server; you could do:
response.setContentLength((int) fileSize);
byte b[] = new byte[1024];
int len;
while ((len = fInStream.read(b)) != -1)
    response.getOutputStream().write(b, 0, len);
fInStream.close();
Where fInStream is the source stream (your image).
I'm trying to send an image from a Qt server through a socket and display it in a client written in Java. Until now I have only transferred strings to communicate on both sides, and I have tried different examples for sending images, but with no results.
The code I used to transfer the image in qt is:
QImage image;
image.load("../punton.png");
qDebug()<<"Image loaded";
QByteArray ban; // Construct a QByteArray object
QBuffer buffer(&ban); // Construct a QBuffer object using the QbyteArray
image.save(&buffer, "PNG"); // Save the QImage data into the QBuffer
socket->write(ban);
In the other end the code to read in Java is:
BufferedInputStream in = new BufferedInputStream(socket.getInputStream(),1);
File f = new File("C:\\Users\\CLOUDMOTO\\Desktop\\JAVA\\image.png");
System.out.println("Receiving...");
FileOutputStream fout = new FileOutputStream(f);
byte[] by = new byte[1];
for(int len; (len = in.read(by)) > 0;){
fout.write(by, 0, len);
System.out.println("Done!");
}
The process in Java gets stuck until I close the Qt server, and after that the generated file is corrupt.
I'd appreciate any help, because I need to get this working and I'm new to programming in both languages.
Also, I've used the following commands; the receiving process now ends and shows a message, but the file is still corrupt. In Qt:
socket->write(ban+"-1");
socket->close();
And in Java:
System.out.println(by);
String received = new String(by, 0, by.length, "ISO8859_1");
System.out.println(received);
System.out.println("Done!");
You cannot transport a file over a socket in such a simple way. You are not giving the receiver any clue about how many bytes are coming. Read the javadoc for InputStream.read() carefully. Your receiver is in an endless loop because it waits for the next byte until the stream is closed; you partially fixed that by calling socket->close() on the sender side. Ideally, you should write the length of ban into the socket before the buffer itself, read that length on the receiver side, and then receive only that number of bytes. Also flush and close the receiver stream before trying to read the received file.
I have absolutely no idea what you wanted to achieve with socket->write(ban+"-1"). Your logged output starts with %PNG, which is correct, but I can see "-1" at the end, which means you appended characters to the binary image data, hence you corrupted it. Why?
And no, a 1x1 PNG does not have a size of 1 byte. It does not even have 4 bytes (red, green, blue, alpha). PNG needs things like a header and checksums. Have a look at the size of the file on the filesystem; that is your required size.
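The length-prefix idea can be sketched on the Java side like this; frame() shows the sender's framing in Java for symmetry (the Qt side would pack a 32-bit big-endian length the same way, e.g. with QDataStream), and receive() is what the Java client would run against socket.getInputStream():

```java
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class LengthPrefixed {
    // Sender side: length first, then the payload.
    static byte[] frame(byte[] payload) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(bos);
        out.writeInt(payload.length);   // 4-byte big-endian length prefix
        out.write(payload);
        return bos.toByteArray();
    }

    // Receiver side: read the length, then exactly that many bytes.
    static byte[] receive(InputStream raw) throws IOException {
        DataInputStream in = new DataInputStream(raw);
        int length = in.readInt();
        byte[] payload = new byte[length];
        in.readFully(payload);          // blocks until all 'length' bytes have arrived
        return payload;
    }
}
```

Over a socket, raw would be socket.getInputStream(); the file can then be written out once the full payload has arrived, with no sentinel bytes appended to the image data.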
I need to provide a feature where users can download reports in Excel/CSV format from my web application. I once wrote a module that created an Excel file on disk, then read it back and sent it to the browser; that worked correctly. This time I don't want to generate a file, as I don't have that level of control over the file system. I guess one way is to build the content in a StringBuffer and set the correct content type (I am not sure about this approach). Another team has this feature too, but they struggle when the data is very large. What is the best way to provide this feature, considering the data could be huge? Is it possible to send the data in chunks without the client noticing (except for a delay in downloading)?
One issue I forgot to add: with very large data, there are also problems on the server side (CPU utilization and memory consumption). Is it possible to read a fixed number of records, say 500, send them to the client, then read another 500, until complete?
You can also generate HTML instead of CSV and still set the content type to Excel. This is nice for colouring and styled text.
You can also use gzip compression when the client accepts that compression. Normally there are standard means, like a servlet filter.
Never use a StringBuffer, or the better StringBuilder, for this; stream the output instead. If you do not (or cannot) call setContentLength, the output goes out chunked (without predictive progress).
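A minimal sketch of the streaming approach: write one page of records at a time and flush, so the container sends a chunk instead of buffering the whole report. The page size of 500 and the use of a plain Writer are illustrative; in a servlet, out would be response.getWriter() after setting the content type and a Content-Disposition header:

```java
import java.io.IOException;
import java.io.Writer;
import java.util.List;

public class CsvStreamer {
    // Quotes a single CSV field per RFC 4180 when it contains a comma, quote, or newline.
    static String csvField(String value) {
        if (value.contains(",") || value.contains("\"") || value.contains("\n")) {
            return "\"" + value.replace("\"", "\"\"") + "\"";
        }
        return value;
    }

    // Writes one page of rows (e.g. 500 records) and flushes, so memory use
    // stays bounded by the page size rather than the full report.
    static void writeChunk(Writer out, List<String[]> rows) throws IOException {
        for (String[] row : rows) {
            for (int i = 0; i < row.length; i++) {
                if (i > 0) out.write(',');
                out.write(csvField(row[i]));
            }
            out.write("\r\n");
        }
        out.flush(); // pushes this chunk to the client
    }
}
```

The servlet then loops: fetch the next 500 records from the database, call writeChunk, and repeat until there are no more rows.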
URL url = new URL("http://localhost:8080/Works/images/address.csv");
response.setHeader("Content-Type", "text/csv");
response.setHeader("Content-Disposition", "attachment; filename=myFile.csv");
URLConnection connection = url.openConnection();
InputStream stream = connection.getInputStream();
BufferedOutputStream outs = new BufferedOutputStream(response.getOutputStream());
int len;
byte[] buf = new byte[1024];
while ((len = stream.read(buf)) > 0) {
    outs.write(buf, 0, len);
}
stream.close();
outs.close();