mediaReader.readPacket() blocks while trying to read rtsp stream in Xuggler - java

I am trying to download a video (with the Xuggler 5.4 library) from an RTSP stream to a file using the code below.
String inputSource = "rtsp://[ip-address]:[port]/user=[username]&password=[password]&channel=1&stream=1.sdp";
String outputFilename = "d:/downloadedrtsp.flv";
try {
    IContainerFormat inFormat = IContainerFormat.make();
    inFormat.setInputFormat("h264");
    IMediaReader mediaReader = ToolFactory.makeReader(inputSource);
    mediaReader.setQueryMetaData(false);
    IMediaWriter mediaWriter = ToolFactory.makeWriter(outputFilename, mediaReader);
    mediaReader.addListener(mediaWriter);
    logger.info("before reading");
    IError error;
    while ((error = mediaReader.readPacket()) == null) {
        logger.info("reading packet");
    }
    logger.info("error: " + error.getDescription());
    logger.info(error.getType());
    logger.info(error.toString());
} catch (Exception e) {
    e.printStackTrace();
}
The problem is that after printing "before reading" the code just stops executing, and after a long time it prints three lines from the logger:
error: Unknown error
ERROR_EOF
Unknown error
The stream works great when I open it in the VLC media player. I am sure there is some mistake in my mediaReader configuration, but I don't know where exactly, as I have very little experience working with video. Here is some information about the video, taken from VLC:

It seems like everything works as expected.
The error type ERROR_EOF marks the end of the input stream (see the documentation).
The long time your program seems to "stop executing" is the time it takes Xuggler to convert the video frames (it doesn't actually "stop"; it just iterates through the while loop).
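If you want the loop to distinguish a normal end of stream from a real failure, you can branch on the error type after the loop. A minimal sketch, assuming Xuggler's IError.Type enum (which your own log output already shows):

IError error;
while ((error = mediaReader.readPacket()) == null) {
    // each successful call decodes one packet and forwards it to the writer
}
if (error.getType() == IError.Type.ERROR_EOF) {
    logger.info("reached end of input; the file was written completely");
} else {
    logger.error("reading failed: " + error.getDescription());
}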

Related

GStreamer java: AppSink not receiving new_sample signal

I have the task of streaming an IP camera's video (RTP/RTSP in h264) via a J2EE application server to a browser. For this I am using GStreamer 1.21.3 (latest dev release) with the gstreamer-java library on top. We are aiming for a WebSocket solution, as traditional HLS introduces significant latency.
After having figured out what to do with the gst-launch executable on the commandline, I ended up with this code (for the moment):
/*
 * Configuration for RTSP over TCP to WebSocket:
 * 1. rtspsrc to ip camera
 * 2. rtph264depay ! h264parse to extract the h264 content
 * 3. mp4mux to create fragmented MP4
 * 4. appsink to grab the frames and use them in Websocket server
 */
final String gstPipeline = String.format("rtspsrc onvif-mode=true protocols=tcp user-id=%s user-pw=%s location=%s latency=200"
        + " ! rtph264depay ! h264parse"
        + " ! mp4mux streamable=true fragment-duration=5000"
        + " ! appsink name=sink", USERNAME, PASSWORD, uri);
final Pipeline pipeline = initGStreamerPipeline(gstPipeline);

// Add listener to consume the incoming data
final AppSink sink = (AppSink) pipeline.getElementByName("sink");
sink.setCaps(Caps.anyCaps());
sink.set("emit-signals", true);
sink.set("max-buffers", 50);
sink.connect((AppSink.NEW_SAMPLE) appsink -> {
    final Sample sample = appsink.pullSample();
    if (sample == null)
    {
        return FlowReturn.OK;
    }
    final Buffer buffer = sample.getBuffer();
    try
    {
        final ByteBuffer buf = buffer.map(false);
        LOGGER.debug("Unicast HTTP/TCP message received: {}", new String(Hex.encodeHex(buf, true)));
        if (session != null)
        {
            try
            {
                buf.flip();
                session.getRemote().sendBytes(buf);
            }
            catch (final Exception e)
            {
                LOGGER.error("Failed to send data via WebSocket", e);
            }
        }
    }
    finally
    {
        buffer.unmap();
    }
    return FlowReturn.OK;
});
sink.connect((AppSink.EOS) s -> LOGGER.info("Appsink is EOS"));
sink.connect((AppSink.NEW_PREROLL) s -> {
    LOGGER.info("Appsink NEW_PREROLL");
    return FlowReturn.OK;
});
LOGGER.info("Connecting to {}", uri);

/**
 * Start the pipeline. Attach a bus listener to call Gst.quit on EOS or error.
 */
pipeline.getBus().connect((Bus.ERROR) ((source, code, message) -> {
    LOGGER.info(message);
    Gst.quit();
}));
pipeline.getBus().connect((Bus.EOS) (source) -> Gst.quit());
pipeline.play();

/**
 * Wait until Gst.quit() is called.
 */
LOGGER.info("Starting to consume media stream...");
Gst.main();
pipeline.stop();
server.stop();
Now I seem to be stuck here, because the AppSink at the end of the pipeline never gets its new_sample signal triggered. The complete example works like a charm when I replace the appsink with a filesink. I have noticed that there are some other threads (like this one) with similar problems, which normally boil down to "you forgot to set emit-signals=true". Any ideas why my appsink gets no data?
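For reference, a sketch of the working filesink variant mentioned above (the output path is made up); since this variant works, the problem is narrowed down to the appsink configuration or the URI:

final String debugPipeline = String.format("rtspsrc onvif-mode=true protocols=tcp user-id=%s user-pw=%s location=%s latency=200"
        + " ! rtph264depay ! h264parse"
        + " ! mp4mux streamable=true fragment-duration=5000"
        + " ! filesink location=/tmp/out.mp4", USERNAME, PASSWORD, uri); // hypothetical path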
Update:
It appears that the problem is the URL I am passing to the pipeline string. It has two query parameters: http://192.168.xx.xx:544/streaming?video=0&meta=1. If I remove the second parameter (and the ampersand along with it), the pipeline works. Unfortunately I have found no docs on how to escape URLs correctly so that GStreamer can read them. Can anyone share such documentation?
Update 2:
It starts getting weird now: it looks like the name of the URL parameter is the problem. I started replacing it with a dummy argument and it works, so the ampersand is not the problem. Then I used VLC media player to consume the stream with the &meta=1 in place, which also worked. Is it possible that the string "meta" is treated specially in GStreamer?

Decode h264 video to java.awt.image.BufferedImage in java

I am trying to make an AirPlay server in Java with this library. I am able to start the server and connect to it, and I am getting video input; however, the input is in h264 format and I tried decoding it with JCodec, but it always says it needs an SPS/PPS and I don't know how to create/find this with just a byte[]. This is the onVideo method, which is pretty much copy-pasted from some websites:
@Override
public void onVideo(byte[] video) {
    try {
        videoFileChannel.write(ByteBuffer.wrap(video));
        ByteBuffer bb = ByteBuffer.wrap(video);
        H264Decoder decoder = new H264Decoder();
        decoder.addSps(List.of(ByteBuffer.wrap(video)));
        Picture out = Picture.create(1920, 1088, ColorSpace.YUV420);
        var real = decoder.decodeFrame(bb, out.getData());
        // decoder.decodeFrame prints "[WARN] . (:0): Skipping frame as no SPS/PPS have been seen so far..."
        // in the console and returns null => NullPointerException in the next line
        var img = AWTUtil.toBufferedImage(real.createCompatible());
        // ...
    } catch (IOException e) {
        e.printStackTrace();
    }
}
Edit: I've uploaded a ("working") version to GitHub, but the decoded image is discolored and doesn't update all pixels, so when something is on the screen and the frame changes, that something can still be visible in the image.
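Since the decoder complains about missing SPS/PPS, one approach is to scan the raw Annex-B byte stream for the parameter-set NAL units (type 7 = SPS, type 8 = PPS) and hand them to the decoder before decoding frames. A minimal sketch, assuming 00 00 01 style start codes; whether your JCodec version exposes an addPps counterpart to the addSps call above needs checking:

List<ByteBuffer> spsList = new ArrayList<>();
List<ByteBuffer> ppsList = new ArrayList<>();
int i = 0;
while (i + 3 < video.length) {
    if (video[i] == 0 && video[i + 1] == 0 && video[i + 2] == 1) {
        int start = i + 3; // first byte of the NAL unit (its header)
        int end = start;
        while (end + 2 < video.length
                && !(video[end] == 0 && video[end + 1] == 0 && video[end + 2] == 1)) {
            end++;
        }
        if (end + 2 >= video.length) {
            end = video.length; // the last NAL unit runs to the end of the buffer
        }
        int nalType = video[start] & 0x1F; // low five bits hold the NAL unit type
        if (nalType == 7) spsList.add(ByteBuffer.wrap(video, start, end - start));
        if (nalType == 8) ppsList.add(ByteBuffer.wrap(video, start, end - start));
        i = end;
    } else {
        i++;
    }
}
decoder.addSps(spsList);   // same call as in the snippet above
// decoder.addPps(ppsList); // assumed counterpart of addSps; verify it exists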

How to use OpenNLP parser models in an Android app?

I went through this link for Java NLP: https://www.tutorialspoint.com/opennlp/index.htm
I tried below code in android:
try {
    File file = copyAssets();
    // InputStream inputStream = new FileInputStream(file);
    ParserModel model = new ParserModel(file);
    // Creating a parser
    Parser parser = ParserFactory.create(model);
    // Parsing the sentence
    String sentence = "Tutorialspoint is the largest tutorial library.";
    Parse topParses[] = ParserTool.parseLine(sentence, parser, 1);
    for (Parse p : topParses) {
        p.show();
    }
} catch (Exception e) {
}
I downloaded the file en-parser-chunking.bin from the internet and placed it in the assets of the Android project, but the code stops on the third line, i.e. ParserModel model = new ParserModel(file);, without throwing any exception. How can I make this work in Android? If it doesn't work, is there any other support for NLP in Android that doesn't consume any services?
The reason the code stalls/breaks at runtime is that you need to use an InputStream instead of a File to load the binary model resource. Most likely, the File instance is null when you "load" it the way indicated in line 2. In theory, this constructor of ParserModel should detect this and throw an IOException. Yet, sadly, the JavaDoc of OpenNLP is not precise about this kind of situation, and you are not handling this exception properly in your catch block.
Moreover, the code snippet you presented should be improved so that you know what actually went wrong.
Therefore, loading a ParserModel from within an Activity should be done differently. Here is a variant that takes care of both aspects:
AssetManager assetManager = getAssets();
InputStream in = null;
try {
    in = assetManager.open("en-parser-chunking.bin");
    if (in != null) {
        // From here, <parserModel> is initialized and you can start playing with it...
        ParserModel parserModel = new ParserModel(in);
        // Creating a parser
        Parser parser = ParserFactory.create(parserModel);
        // Parsing the sentence
        String sentence = "Tutorialspoint is the largest tutorial library.";
        Parse topParses[] = ParserTool.parseLine(sentence, parser, 1);
        for (Parse p : topParses) {
            p.show();
        }
    } else {
        // resource file not found - whatever you want to do in this case
        Log.w("NLP", "OpenNLP binary model file could not be found in assets.");
    }
} catch (Exception ex) {
    Log.e("NLP", "message: " + ex.getMessage(), ex);
    // proper exception handling here...
} finally {
    if (in != null) {
        try {
            in.close();
        } catch (IOException e) {
            // ignore failures on close
        }
    }
}
This way, you're using an InputStream approach and at the same time taking care of proper exception and resource handling. Moreover, you can now use a debugger in case something remains unclear with the resource path references of your model files. For reference, see the official JavaDoc of AssetManager#open(String resourceName).
Note well:
Loading OpenNLP's binary resources can consume quite a lot of memory. For this reason, it may happen that your Android app's request to allocate the memory needed for this operation is not granted by the actual runtime (i.e., smartphone) environment.
Therefore, carefully monitor the amount of requested/required RAM while parserModel = new ParserModel(in); is invoked.
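A rough way to watch that allocation is plain Runtime arithmetic around the load (a sketch; the numbers are approximate and vary with GC timing):

Runtime rt = Runtime.getRuntime();
long before = rt.totalMemory() - rt.freeMemory();
ParserModel parserModel = new ParserModel(in);
long after = rt.totalMemory() - rt.freeMemory();
Log.d("NLP", "Model load consumed roughly " + ((after - before) / (1024 * 1024)) + " MB");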
Hope it helps.

How to properly detect, decode and play a radio stream?

I am currently trying to write a jukebox-like application in Java that is able to play any audio source possible, but I have encountered some difficulties when trying to play radio streams.
For playback I use JLayer from JavaZoom, which works fine as long as the target is a direct media file or a direct media stream (I can play PCM, MP3 and OGG just fine). However, I encounter difficulties with radio streams that either send pre-media data like an m3u/pls file (which I could fix by adding detection beforehand), or that are streamed on port 80 while a web page exists at the same location and the media transmitted depends on the type of request. In the latter case, whenever I try to stream the media, I get the HTML data instead.
Example link of a stream that is hidden behind a web-page: http://stream.t-n-media.de:8030
This is playable in VLC, but if you put it into a browser or my application you'll receive an HTML file.
Is there:
A ready-made, free solution that I could use in place of JLayer? Preferably open source so I can study it?
A tutorial that can help me to write a solution on my own?
Or can someone give me an example on how to properly detect/request a media stream?
Thanks in advance!
import java.io.*;
import java.net.*;
import javax.sound.sampled.*;
import javax.sound.midi.*;

/**
 * This class plays sounds streaming from a URL: it does not have to preload
 * the entire sound into memory before playing it. It is a command-line
 * application with no gui. It includes code to convert ULAW and ALAW
 * audio formats to PCM so they can be played. Use the -m command-line option
 * before MIDI files.
 */
public class PlaySoundStream {
    // Create a URL from the command-line argument and pass it to the
    // right static method depending on the presence of the -m (MIDI) option.
    public static void main(String[] args) throws Exception {
        if (args[0].equals("-m")) streamMidiSequence(new URL(args[1]));
        else streamSampledAudio(new URL(args[0]));

        // Exit explicitly.
        // This is needed because the audio system starts background threads.
        System.exit(0);
    }

    /** Read sampled audio data from the specified URL and play it */
    public static void streamSampledAudio(URL url)
        throws IOException, UnsupportedAudioFileException,
               LineUnavailableException
    {
        AudioInputStream ain = null;  // We read audio data from here
        SourceDataLine line = null;   // And write it here.
        try {
            // Get an audio input stream from the URL
            ain = AudioSystem.getAudioInputStream(url);

            // Get information about the format of the stream
            AudioFormat format = ain.getFormat();
            DataLine.Info info = new DataLine.Info(SourceDataLine.class, format);

            // If the format is not supported directly (i.e. if it is not PCM
            // encoded), then try to transcode it to PCM.
            if (!AudioSystem.isLineSupported(info)) {
                // This is the PCM format we want to transcode to.
                // The parameters here are audio format details that you
                // shouldn't need to understand for casual use.
                AudioFormat pcm =
                    new AudioFormat(format.getSampleRate(), 16,
                                    format.getChannels(), true, false);

                // Get a wrapper stream around the input stream that does the
                // transcoding for us.
                ain = AudioSystem.getAudioInputStream(pcm, ain);

                // Update the format and info variables for the transcoded data
                format = ain.getFormat();
                info = new DataLine.Info(SourceDataLine.class, format);
            }

            // Open the line through which we'll play the streaming audio.
            line = (SourceDataLine) AudioSystem.getLine(info);
            line.open(format);

            // Allocate a buffer for reading from the input stream and writing
            // to the line. Make it large enough to hold 4k audio frames.
            // Note that the SourceDataLine also has its own internal buffer.
            int framesize = format.getFrameSize();
            byte[] buffer = new byte[4 * 1024 * framesize]; // the buffer
            int numbytes = 0;                               // how many bytes

            // We haven't started the line yet.
            boolean started = false;

            for (;;) { // We'll exit the loop when we reach the end of stream
                // First, read some bytes from the input stream.
                int bytesread = ain.read(buffer, numbytes, buffer.length - numbytes);
                // If there were no more bytes to read, we're done.
                if (bytesread == -1) break;
                numbytes += bytesread;

                // Now that we've got some audio data to write to the line,
                // start the line, so it will play that data as we write it.
                if (!started) {
                    line.start();
                    started = true;
                }

                // We must write bytes to the line in an integer multiple of
                // the framesize. So figure out how many bytes we'll write.
                int bytestowrite = (numbytes / framesize) * framesize;

                // Now write the bytes. The line will buffer them and play
                // them. This call will block until all bytes are written.
                line.write(buffer, 0, bytestowrite);

                // If we didn't have an integer multiple of the frame size,
                // then copy the remaining bytes to the start of the buffer.
                int remaining = numbytes - bytestowrite;
                if (remaining > 0)
                    System.arraycopy(buffer, bytestowrite, buffer, 0, remaining);
                numbytes = remaining;
            }

            // Now block until all buffered sound finishes playing.
            line.drain();
        }
        finally { // Always relinquish the resources we use
            if (line != null) line.close();
            if (ain != null) ain.close();
        }
    }

    // A MIDI protocol constant that isn't defined by javax.sound.midi
    public static final int END_OF_TRACK = 47;

    /** Read MIDI or RMF data from the specified URL and play it */
    public static void streamMidiSequence(URL url)
        throws IOException, InvalidMidiDataException, MidiUnavailableException
    {
        Sequencer sequencer = null;     // Converts a Sequence to MIDI events
        Synthesizer synthesizer = null; // Plays notes in response to MIDI events
        try {
            // Create, open, and connect a Sequencer and Synthesizer
            // They are closed in the finally block at the end of this method.
            sequencer = MidiSystem.getSequencer();
            sequencer.open();
            synthesizer = MidiSystem.getSynthesizer();
            synthesizer.open();
            sequencer.getTransmitter().setReceiver(synthesizer.getReceiver());

            // Specify the InputStream to stream the sequence from
            sequencer.setSequence(url.openStream());

            // This is an arbitrary object used with wait and notify to
            // prevent the method from returning before the music finishes
            final Object lock = new Object();

            // Register a listener to make the method exit when the stream is
            // done. See Object.wait() and Object.notify()
            sequencer.addMetaEventListener(new MetaEventListener() {
                public void meta(MetaMessage e) {
                    if (e.getType() == END_OF_TRACK) {
                        synchronized (lock) {
                            lock.notify();
                        }
                    }
                }
            });

            // Start playing the music
            sequencer.start();

            // Now block until the listener above notifies us that we're done.
            synchronized (lock) {
                while (sequencer.isRunning()) {
                    try { lock.wait(); } catch (InterruptedException e) { }
                }
            }
        }
        finally {
            // Always relinquish the sequencer, so others can use it.
            if (sequencer != null) sequencer.close();
            if (synthesizer != null) synthesizer.close();
        }
    }
}
I have used this piece of code in one of my projects that deal with audio streaming, and it worked just fine.
Furthermore, you can see similar examples here:
Java Audio Example
Just reading the javadoc of AudioSystem gave me an idea.
There is another signature for getAudioInputStream: you can give it an InputStream instead of a URL.
So, try to get the input stream yourself and add the needed headers so that you get the stream instead of the HTML content:
URLConnection uc = url.openConnection();
uc.setRequestProperty("<header name here>", "<header value here>");
InputStream in = uc.getInputStream();
ain = AudioSystem.getAudioInputStream(in);
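For many Shoutcast-style servers the decision between status page and raw stream hinges on the request headers, typically the User-Agent. A sketch (the header value is illustrative, not verified for this particular station; the BufferedInputStream wrapper matters because getAudioInputStream requires a stream that supports mark/reset):

URLConnection uc = url.openConnection();
// Pretend to be a media player rather than a browser; many such servers
// serve HTML to browsers and the raw stream to players.
uc.setRequestProperty("User-Agent", "WinampMPEG/5.09");
InputStream in = new BufferedInputStream(uc.getInputStream());
AudioInputStream ain = AudioSystem.getAudioInputStream(in);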
Hope this helps.
I know this answer comes late, but I had the same issue: I wanted to play MP3 and AAC audio and also wanted the user to insert PLS/M3U links. Here is what I did:
First I tried to parse the type by using the simple file name:
import de.webradio.enumerations.FileExtension;

import java.net.URL;

public class FileExtensionParser {
    /**
     * Parses a file extension
     * @param filenameUrl the url
     * @return the detected FileExtension; if it cannot be determined from the
     *         file name, Apache Tika detects it from the live content
     */
    public FileExtension parseFileExtension(URL filenameUrl) {
        String filename = filenameUrl.toString();
        if (filename.endsWith(".mp3")) {
            return FileExtension.MP3;
        } else if (filename.endsWith(".m3u") || filename.endsWith(".m3u8")) {
            return FileExtension.M3U;
        } else if (filename.endsWith(".aac")) {
            return FileExtension.AAC;
        } else if (filename.endsWith(".pls")) {
            return FileExtension.PLS;
        }
        URLTypeParser parser = new URLTypeParser();
        return parser.parseByContentDetection(filenameUrl);
    }
}
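A hypothetical call site, for illustration (the URL is made up):

FileExtensionParser parser = new FileExtensionParser();
FileExtension ext = parser.parseFileExtension(new URL("http://streams.example.com/radio.pls"));
// ext is FileExtension.PLS; a name without a known suffix falls through to URLTypeParser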
If that fails, I use Apache Tika to do a kind of live detection:
public class URLTypeParser {
    /**
     * This class uses Apache Tika to parse a URL using its content
     *
     * @param url the webstream url
     * @return the detected file encoding: MP3, AAC or unsupported
     */
    public FileExtension parseByContentDetection(URL url) {
        try {
            HttpURLConnection connection = (HttpURLConnection) url.openConnection();
            InputStream in = connection.getInputStream();
            BodyContentHandler handler = new BodyContentHandler();
            AudioParser parser = new AudioParser();
            Metadata metadata = new Metadata();
            parser.parse(in, handler, metadata);
            return parseMediaType(metadata);
        } catch (IOException | TikaException | SAXException e) {
            e.printStackTrace();
        }
        return FileExtension.UNSUPPORTED_TYPE;
    }

    private FileExtension parseMediaType(Metadata metadata) {
        String parsedMediaType = metadata.get("encoding");
        if (parsedMediaType.equalsIgnoreCase("aac")) {
            return FileExtension.AAC;
        } else if (parsedMediaType.equalsIgnoreCase("mpeg1l3")) {
            return FileExtension.MP3;
        }
        return FileExtension.UNSUPPORTED_TYPE;
    }
}
This will also solve the HTML problem, since the method will return FileExtension.UNSUPPORTED_TYPE for HTML content.
I combined these classes with a factory pattern and it works fine. The live detection takes only about two seconds.
I don't think this will still help you, but since I struggled with it for almost three weeks I wanted to provide a working answer. You can see the whole project on GitHub: https://github.com/Seppl2202/webradio

Accessing Windows disks directly with Java NIO

I am using a library that uses Java NIO in order to directly map files to memory, but I am having trouble reading disks directly.
I can read the disks directly using FileInputStream with a UNC path, such as:
File disk = new File("\\\\.\\PhysicalDrive0\\");
try (FileInputStream fis = new FileInputStream(disk);
     BufferedInputStream bis = new BufferedInputStream(fis)) {
    byte[] somebytes = new byte[10];
    bis.read(somebytes);
} catch (Exception ex) {
    System.out.println("Oh bother");
}
However, I can't extend this to NIO:
File disk = new File("\\\\.\\PhysicalDrive0\\");
Path path = disk.toPath();
try (FileChannel fc = FileChannel.open(path, StandardOpenOption.READ)) {
    System.out.println("No exceptions! Yay!");
} catch (Exception ex) {
    System.out.println("Oh bother");
}
The stacktrace (up to the cause) is:
java.nio.file.FileSystemException: \\.\PhysicalDrive0\: The parameter is incorrect.
at sun.nio.fs.WindowsException.translateToIOException(WindowsException.java:86)
at sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:97)
at sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:102)
at sun.nio.fs.WindowsFileSystemProvider.newFileChannel(WindowsFileSystemProvider.java:115)
at java.nio.channels.FileChannel.open(FileChannel.java:287)
at java.nio.channels.FileChannel.open(FileChannel.java:334)
at hdreader.HDReader.testcode(HDReader.java:147)
I haven't been able to find a solution, though I saw something close in How to access specific raw data on disk from java. The answer by Daniel Alder suggesting the use of GLOBALROOT seems relevant, as that answer uses a FileChannel, but I can't seem to find the drive using this pattern. Is there a way to list all devices under GLOBALROOT or something like that?
At the moment I am looking at replacing uses of NIO with straight InputStreams, but I want to avoid this if I can. Firstly, NIO was used for a reason, and secondly, it runs through a lot of code and will require a lot of work. Finally, I'd like to know how to implement something like Daniel's solution so that I can write to devices or use NIO in the future.
So in summary: how can I access drives directly with Java NIO (not InputStreams), and/or is there a way to list all devices accessible through GLOBALROOT so that I might use Daniel Alder's solution?
Summary of Answers:
I have kept the past edits (below) to avoid confusion. With the help of EJP and Apangin I think I have a workable solution, something like:
private ByteBuffer rafMethod(long posn) {
    ByteBuffer buffer = ByteBuffer.allocate(512);
    buffer.rewind();
    try (RandomAccessFile raf = new RandomAccessFile(disk.getPath(), "r");
         SeekableByteChannel sbc = raf.getChannel()) {
        sbc.position(posn); // posn must be a multiple of the sector size
        sbc.read(buffer);
    } catch (Exception ex) {
        System.out.println("Oh bother: " + ex);
        ex.printStackTrace();
    }
    return buffer;
}
This will work as long as the posn parameter is a multiple of the sector size (set at 512 in this case). Note that this also works with Channels.newChannel(FileInputStream), which seems to always return a SeekableByteChannel in this case, and it appears to be safe to cast it to one.
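An equivalent route that avoids the cast altogether: FileInputStream.getChannel() returns a FileChannel, which is itself a SeekableByteChannel. A sketch, with posn again being a sector-aligned offset:

ByteBuffer buffer = ByteBuffer.allocate(512);
try (FileInputStream fis = new FileInputStream("\\\\.\\PhysicalDrive0");
     FileChannel fc = fis.getChannel()) {
    fc.position(posn); // must be a multiple of the sector size
    fc.read(buffer);
} catch (IOException ex) {
    ex.printStackTrace();
}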
From quick and dirty testing it appears that these methods truly do seek and don't just skip. I read a thousand locations at the start of my drive, then did the same with an offset of half the disk size added (to search the back of the disk). I found:
Both methods took almost the same time.
Searching the start or the end of the disk did not affect time.
Reducing the range of the addresses did reduce time.
Sorting the addresses did reduce time, but not by much.
This suggests to me that this is truly seeking and not merely reading and skipping (as a stream tends to). The speed is still terrible at this stage and it makes my hard drive sound like a washing machine, but the code was designed for a quick test and has yet to be made pretty. It may still work fine.
Thanks to both EJP and Apangin for the help. Read more in their respective answers.
Edit:
I have since run my code on a Windows 7 machine (I didn't have one originally), and I get a slightly different exception (see below). This was run with admin privileges, and the first piece of code still works under the same conditions.
java.nio.file.FileSystemException: \\.\PhysicalDrive0\: A device attached to the system is not functioning.
at sun.nio.fs.WindowsException.translateToIOException(WindowsException.java:86)
at sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:97)
at sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:102)
at sun.nio.fs.WindowsFileSystemProvider.newFileChannel(WindowsFileSystemProvider.java:115)
at java.nio.channels.FileChannel.open(FileChannel.java:287)
at java.nio.channels.FileChannel.open(FileChannel.java:335)
at testapp.TestApp.doStuff(TestApp.java:30)
at testapp.TestApp.main(TestApp.java:24)
Edit 2:
In response to EJP, I have tried:
byte[] bytes = new byte[20];
ByteBuffer bb = ByteBuffer.wrap(bytes);
bb.rewind();
File disk = new File("\\\\.\\PhysicalDrive0\\");
try (FileInputStream fis = new FileInputStream(disk);
     ReadableByteChannel rbc = Channels.newChannel(fis)) {
    System.out.println("Channel created");
    int read = rbc.read(bb);
    System.out.println("Read " + read + " bytes");
    System.out.println("No exceptions! Yay!");
} catch (Exception ex) {
    System.out.println("Oh bother: " + ex);
}
When I try this I get the following output:
Channel created
Oh bother: java.io.IOException: The parameter is incorrect
So it appears that I can create a FileChannel or ReadableByteChannel, but I can't use it; that is, the error is simply deferred.
When accessing a physical drive without buffering, you can only read complete sectors. This means that if the sector size is 512 bytes, you can only read multiples of 512 bytes. Change your buffer length to 512 or 4096 (whatever your sector size is) and FileChannel will work fine:
ByteBuffer buf = ByteBuffer.allocate(512);
try (RandomAccessFile raf = new RandomAccessFile("\\\\.\\PhysicalDrive0", "r");
     FileChannel fc = raf.getChannel()) {
    fc.read(buf);
    System.out.println("It worked! Read bytes: " + buf.position());
} catch (Exception e) {
    e.printStackTrace();
}
See Alignment and File Access Requirements.
Your original FileInputStream code obviously works because of the BufferedInputStream, which has a default buffer size of 8192. Take it away and the code will fail with the same exception.
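To see this for yourself, strip the buffering from the first snippet; the sub-sector read then fails directly (a sketch that deliberately provokes the error):

File disk = new File("\\\\.\\PhysicalDrive0\\");
try (FileInputStream fis = new FileInputStream(disk)) {
    byte[] somebytes = new byte[10]; // not a multiple of the sector size
    fis.read(somebytes);             // fails: "The parameter is incorrect"
} catch (Exception ex) {
    ex.printStackTrace();
}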
Using NIO your original code only needs to change very slightly.
Path disk = Paths.get("d:\\.");
try (ByteChannel bc = Files.newByteChannel(disk, StandardOpenOption.READ)) {
    ByteBuffer buffer = ByteBuffer.allocate(10);
    bc.read(buffer);
} catch (Exception e) {
    e.printStackTrace();
}
This is fine, workable code, but I get an access denied error in both your version and mine.
Run this as administrator. It really does work, as it's only a thin wrapper over java.io:
try (FileInputStream fis = new FileInputStream(disk);
     ReadableByteChannel fc = Channels.newChannel(fis))
{
    System.out.println("No exceptions! Yay!");
    ByteBuffer bb = ByteBuffer.allocate(4096);
    int count = fc.read(bb);
    System.out.println("read count=" + count);
}
catch (Exception ex)
{
    System.out.println("Oh bother: " + ex);
    ex.printStackTrace();
}
EDIT: If you need random access, you're stuck with RandomAccessFile. There's no mapping from that via Channels. But the solution above isn't NIO anyway, just a Java NIO layer over FileInput/OutputStream.
