I have the following class, which is used to transmit audio or video files over RTP in Java.
So far so good.
What I want is to modify the UnicastRtp class so it can transmit the screen, i.e. so that a MediaLocator like this can be used:
new MediaLocator("screen://0,0,1280,800/25")
I searched a lot on the Internet and I think the solution is to change the way the processor is created and configured.
Capturing the contents of the screen is handled by the StreamPantalla and DataSourcePantalla classes.
I already wrote a program that displays an area of the screen in a player (using the same StreamPantalla and DataSourcePantalla classes), so I know they work well.
Now what I need is to change the UnicastRtp class so it can configure a processor for transmitting the contents of the screen.
Would appreciate any help or clues.
Thank you very much for the help.
Greetings!
This is my solution:
MediaLocator ml = new MediaLocator("screen://0,0,1280,800/25");
DataSource clone = null;
try {
    ds = new DataSourcePantalla();
    ds.setLocator(ml);
    clone = javax.media.Manager.createCloneableDataSource(ds);
} catch (Exception e) {
    System.out.println(e.getMessage());
}
try {
    ds.connect();
    clone.connect();
} catch (IOException e) {
    e.printStackTrace();
}
Format[] outputFormat = new Format[1];
FileTypeDescriptor outputType = new FileTypeDescriptor(FileTypeDescriptor.RAW_RTP);
outputFormat[0] = new VideoFormat(VideoFormat.JPEG_RTP);
ProcessorModel processorModel = new ProcessorModel(clone, outputFormat, outputType);
// Try to create a processor to handle the input media locator
try {
    processor = Manager.createRealizedProcessor(processorModel);
} catch (NoProcessorException npe) {
    System.out.println(npe.getMessage());
} catch (IOException ioe) {
    System.out.println(ioe.getMessage());
} catch (CannotRealizeException e) {
    System.out.println(e.getMessage());
}
boolean result = waitForState(processor, Processor.Configured);
if (!result) {
    System.out.println("Error: could not configure the processor in UnicastRtpPantalla::createMyProcessor");
    return false;
}
TrackControl[] tracks = processor.getTrackControls();
// Search through the tracks for a video track
for (int i = 0; i < tracks.length; i++) {
    Format format = tracks[i].getFormat();
    if (tracks[i].isEnabled() && format instanceof VideoFormat) {
        System.out.println("Video track " + i + " has format: " + tracks[i].getFormat());
        // Found a video track. Try to program it to output JPEG/RTP.
        // Make sure the sizes are multiples of 8.
        float frameRate = 25; // ((VideoFormat) format).getFrameRate();
        Dimension size = new Dimension(1280, 800); // ((VideoFormat) format).getSize();
        int w = (size.width % 8 == 0 ? size.width : (size.width / 8) * 8);
        int h = (size.height % 8 == 0 ? size.height : (size.height / 8) * 8);
        VideoFormat jpegFormat = new VideoFormat(VideoFormat.JPEG_RTP,
                new Dimension(w, h), Format.NOT_SPECIFIED,
                Format.byteArray, frameRate);
        tracks[i].setFormat(jpegFormat);
        System.out.println("Video track " + i + " was changed to format: " + tracks[i].getFormat());
    } else {
        tracks[i].setEnabled(false);
    }
}
// Set the output content descriptor to RAW_RTP
ContentDescriptor cd = new ContentDescriptor(ContentDescriptor.RAW_RTP);
processor.setContentDescriptor(cd);
ds = processor.getDataOutput();
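For completeness: the snippet calls a waitForState helper that isn't shown above. A minimal sketch of such a helper, assuming the usual JMF ControllerListener pattern (the exact implementation in UnicastRtp may differ):

import javax.media.ControllerErrorEvent;
import javax.media.ControllerEvent;
import javax.media.ControllerListener;
import javax.media.Processor;

// Sketch only: block until the processor reaches the target state or
// fails. JMF state constants are ordered numerically, so < works.
private boolean waitForState(Processor p, int state) {
    final Object lock = new Object();
    final boolean[] failed = { false };
    p.addControllerListener(new ControllerListener() {
        public void controllerUpdate(ControllerEvent e) {
            if (e instanceof ControllerErrorEvent) {
                failed[0] = true;
            }
            synchronized (lock) {
                lock.notifyAll();
            }
        }
    });
    if (p.getState() >= state) {
        return true; // already there (e.g. after createRealizedProcessor)
    }
    if (state == Processor.Configured) {
        p.configure();
    } else if (state == Processor.Realized) {
        p.realize();
    }
    while (p.getState() < state && !failed[0]) {
        synchronized (lock) {
            try {
                lock.wait(100);
            } catch (InterruptedException ie) {
                return false;
            }
        }
    }
    return !failed[0];
}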
Regards!
I have a problem with textures in JOGL. I draw a bookshelf, and when the shelf is detected as marked, the texture should change. Here is my code so far:
Texture book;
if (Library.touchTime != 0 && Library.marked.equals(name)){
long actTime = System.currentTimeMillis();
if (actTime - Library.touchTime <= 2000){
this.book = books_marked;
}
else{
Library.touchTime = 0;
Library.marked = "";
this.book = books;
}
}
book.enable();
book.bind();
//---- front --------------------------------------------------
gl.glBegin(GL.GL_QUADS);
normVector = front.getNorm();
gl.glNormal3f(normVector.getX(), normVector.getY(), normVector.getZ());
drawRect(gl, 0, 1, 2, 3);
gl.glEnd();
The Texture objects books and books_marked are constructor parameters and are created like this:
// ---- Load Book Texture -----------------------------------------------
try {
InputStream stream = getClass().getResourceAsStream("books.jpg");
data = TextureIO.newTextureData(stream, false, "jpg");
books = TextureIO.newTexture(data);
} catch (IOException e) {
e.printStackTrace();
System.exit(1);
}
// -------------------------------------------------------------
// ---- Load Book_marked Texture -------------------------------------
try {
InputStream stream = getClass().getResourceAsStream("books_marked.jpg");
data = TextureIO.newTextureData(stream, false, "jpg");
books_marked = TextureIO.newTexture(data);
} catch (IOException e) {
e.printStackTrace();
System.exit(1);
}
// -------------------------------------------------------------------------
My intent was to assign the book Texture object within the if condition so that the bind() call would automatically bind the correct picture. But the texture doesn't change. Does anybody have an idea what I got wrong here?
Call Texture.bind(GL) or glBindTexture on the texture you actually want drawn, every frame, before issuing the geometry; assigning a Java reference by itself does not rebind anything in GL.
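A minimal sketch of that idea (the helper name and the marked flag are mine, assuming books and books_marked are the two textures loaded in the constructor):

import com.jogamp.opengl.GL2;
import com.jogamp.opengl.util.texture.Texture;

// Hypothetical helper: pick and bind the texture for this frame.
// The bind must happen again before every draw; assigning a Java
// reference on a previous frame does not change what GL has bound.
static void bindBookTexture(GL2 gl, Texture books, Texture books_marked, boolean marked) {
    Texture current = marked ? books_marked : books;
    current.enable(gl); // JOGL 2.x signatures; older JOGL takes no GL argument
    current.bind(gl);
}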
I have been attempting to use ZXing 2.3.0 to read images of UPC barcodes with a +5 supplement in Java, but I cannot read the supplement portion of the barcode. The code successfully reads the first portion only. After searching multiple websites I cannot find any further indication of how to read the supplement other than my current method. Any help would be greatly appreciated.
public static void main(String[] args) {
    decodeUPC5();
}

public static void decodeUPC5() {
    InputStream barCodeInputStream = null;
    try {
        barCodeInputStream = new FileInputStream("C:/Users/apoclyps/git/zxing-barcoder/Zxing-Test/img/upc5.png");
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    }
    BufferedImage barCodeBufferedImage = null;
    try {
        barCodeBufferedImage = ImageIO.read(barCodeInputStream);
    } catch (IOException e) {
        e.printStackTrace();
    }
    LuminanceSource source = new BufferedImageLuminanceSource(barCodeBufferedImage);
    BinaryBitmap bitmap = new BinaryBitmap(new HybridBinarizer(source));
    // Attempting to read UPC + 5 supplement
    GenericMultipleBarcodeReader multiReader = new GenericMultipleBarcodeReader(new MultiFormatReader());
    Result[] result = null;
    try {
        result = multiReader.decodeMultiple(bitmap);
    } catch (NotFoundException e) {
        e.printStackTrace();
    }
    System.out.println("Results length " + result.length);
    for (Result r : result) {
        System.out.println("Barcode text is " + r.toString());
    }
}
(Barcode image attached.)
Output
Results length 1
Barcode text is 9780735200449
Keep in mind that the content of the barcode is 9780735200449 and not 9780735200449 51299. It will always (correctly) return 9780735200449 as the contents of the barcode.
The +5 extension is returned as ResultMetadata, under the key ResultMetadataType.UPC_EAN_EXTENSION.
Note that it will still return the UPC barcode even if it doesn't see a +5 extension, obviously. So it's possible you would see it return without the +5 extension on this image. However, it works for me with the app, so I would imagine it detects the +5 easily. (If you scan with the app, look at the left for "Metadata $12.99".)
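A minimal sketch of pulling the extension out of the metadata (reusing the bitmap built in the question; the single hint-free decode is my simplification):

import java.util.Map;
import com.google.zxing.MultiFormatReader;
import com.google.zxing.NotFoundException;
import com.google.zxing.Result;
import com.google.zxing.ResultMetadataType;

try {
    // Decode once, then read the +5 supplement from the result metadata.
    Result result = new MultiFormatReader().decode(bitmap);
    System.out.println("Barcode text is " + result.getText());
    Map<ResultMetadataType, Object> metadata = result.getResultMetadata();
    if (metadata != null && metadata.containsKey(ResultMetadataType.UPC_EAN_EXTENSION)) {
        // The extension is stored separately, e.g. "51299"
        System.out.println("Extension is " + metadata.get(ResultMetadataType.UPC_EAN_EXTENSION));
    }
} catch (NotFoundException e) {
    e.printStackTrace();
}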
I am using a combination of Wireshark's tshark and jnetpcap to decode offline captures and extract the RTP audio payload from files, for the forward and reverse directions.
In the first step I isolate only the RTP packets and save them to a separate file.
Then I simply loop with jnetpcap through that file and save the RTP payload to a file.
In the case where I need both channels, the produced file can be played, but the sampling etc. does not work correctly. It sounds a bit too fast (too high-pitched), so something must be done differently.
Does anybody have a hint on how to save it into two channels, so it works as stereo instead of mono?
final StringBuilder errbuf = new StringBuilder();
Pcap pcap = Pcap.openOffline(filename, errbuf);
if (pcap == null) {
    System.err.printf("Error while opening device for capture: " + errbuf.toString());
    return false;
}
PcapPacketHandler<String> handler = new PcapPacketHandler<String>() {
    public void nextPacket(PcapPacket packet, String user) {
        System.out.println("size of packet is=" + packet.size());
        Rtp rtp = new Rtp();
        if (packet.hasHeader(rtp)) {
            System.out.println("rtp.headerLength = " + rtp.getHeaderLength() + " rtp.payloadLength = " + rtp.getPayloadLength());
            try {
                dos.write(rtp.getPayload());
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
};
int ret = pcap.dispatch(-1, handler, "I rock");
System.out.println("Ret = " + ret);
try {
    dos.close();
} catch (IOException e) {
    e.printStackTrace();
}
The solution is to build a ring buffer as a jitter buffer, synchronize the packets from both directions, and do proper silence generation for the gaps.
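A minimal sketch of that idea (the class name, buffer depth, and u-law silence byte are my assumptions, not code from the question): slot each payload by RTP sequence number and emit silence for lost packets so the timing stays intact.

import java.io.IOException;
import java.io.OutputStream;
import java.util.Arrays;

// Sketch of a ring buffer used as a jitter buffer. Sequence-number
// wrap-around at 65536 is ignored here to keep the idea visible.
class RtpJitterBuffer {
    private static final int CAPACITY = 64;               // assumed depth
    private static final byte ULAW_SILENCE = (byte) 0xFF; // u-law "no sound"

    private final byte[][] slots = new byte[CAPACITY][];
    private int nextSeq = -1;                             // next sequence to emit

    // Store a payload keyed by its RTP sequence number.
    void put(int seq, byte[] payload) {
        if (nextSeq < 0) nextSeq = seq;                   // first packet seeds the clock
        slots[seq % CAPACITY] = payload;
    }

    // Write everything up to and including 'seq', substituting silence
    // for slots that never arrived so the stream keeps its pacing.
    void drainUpTo(int seq, OutputStream out, int payloadLen) throws IOException {
        while (nextSeq >= 0 && nextSeq <= seq) {
            byte[] p = slots[nextSeq % CAPACITY];
            if (p == null) {
                p = new byte[payloadLen];
                Arrays.fill(p, ULAW_SILENCE);
            }
            out.write(p);
            slots[nextSeq % CAPACITY] = null;
            nextSeq++;
        }
    }
}

For the stereo part of the question, one such buffer per direction lets you drain both sides in lockstep and interleave the samples into a two-channel file.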
Previously I was working with JMF, but JMF needs to be installed, and I don't want to add that overhead. That's why I want to move to FMJ. And FMJ is open source. :)
There are some sample examples shipped with the FMJ source, and there is FMJStudio, from which I can transmit RTP audio captured from the microphone.
But when I try to transmit RTP using the source below, it cannot find any capture device.
The complete source can be found in fmj-20070928-0938_2.zip in FMJ.
The class name of this source is SimpleVoiceTransmiter.
//final String urlStr = URLUtils.createUrlStr(new File("samplemedia/gulp2.wav"));//"file://samplemedia/gulp2.wav";
Format format;
format = new AudioFormat(AudioFormat.ULAW_RTP, 8000, 8, 1);
//format = new AudioFormat(AudioFormat.ULAW_RTP, 8000.0, 8, 1, AudioFormat.LITTLE_ENDIAN, AudioFormat.SIGNED);
//format = new AudioFormat(BonusAudioFormatEncodings.ALAW_RTP, 8000, 8, 1);
//format = new AudioFormat(BonusAudioFormatEncodings.SPEEX_RTP, 8000, 8, 1, -1, AudioFormat.SIGNED);
//format = new AudioFormat(BonusAudioFormatEncodings.ILBC_RTP, 8000.0, 16, 1, AudioFormat.LITTLE_ENDIAN, AudioFormat.SIGNED);
CaptureDeviceInfo di = null;
//Set to true if you want to transmit audio from capture device, like microphone.
if (true)
{
// First find a capture device that will capture linear audio
// data at 8-bit, 8 kHz
AudioFormat captureFormat = new AudioFormat(AudioFormat.LINEAR, 8000, 8, 1);
Vector devices = CaptureDeviceManager.getDeviceList(captureFormat);
if (devices.size() > 0)
{
di = (CaptureDeviceInfo) devices.elementAt(0);
} else
{
System.err.println("No capture devices");
// exit if we could not find the relevant capturedevice.
System.exit(-1);
}
}
// Create a processor for this capturedevice & exit if we
// cannot create it
Processor processor = null;
try
{
//processor = Manager.createProcessor(new MediaLocator(urlStr));
processor = Manager.createProcessor(di.getLocator());
} catch (IOException e)
{
e.printStackTrace();
System.exit(-1);
} catch (NoProcessorException e)
{
e.printStackTrace();
System.exit(-1);
}
// configure the processor
processor.configure();
while (processor.getState() != Processor.Configured)
{
try
{
Thread.sleep(10);
} catch (InterruptedException e)
{
e.printStackTrace();
}
}
processor.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW_RTP));
TrackControl track[] = processor.getTrackControls();
boolean encodingOk = false;
// Go through the tracks and try to program one of them to
// output g.711 data.
for (int i = 0; i < track.length; i++)
{
if (!encodingOk && track[i] instanceof FormatControl)
{
if (((FormatControl) track[i]).setFormat(format) == null)
{
track[i].setEnabled(false);
} else
{
encodingOk = true;
}
} else
{
// we could not set this track to g.711, so disable it
track[i].setEnabled(false);
}
}
// At this point, we have determined whether we can send out
// g.711 data or not.
// realize the processor
if (encodingOk)
{
if (!new net.sf.fmj.ejmf.toolkit.util.StateWaiter(processor).blockingRealize())
{
System.err.println("Failed to realize");
return;
}
// get the output datasource of the processor and exit
// if we fail
DataSource ds = null;
try
{
ds = processor.getDataOutput();
} catch (NotRealizedError e)
{
e.printStackTrace();
System.exit(-1);
}
// hand this datasource to manager for creating an RTP
// datasink our RTP datasink will multicast the audio
try
{
String url = "rtp://192.168.1.99:49150/audio/1";
MediaLocator m = new MediaLocator(url);
DataSink d = Manager.createDataSink(ds, m);
d.open();
d.start();
System.out.println("Starting processor");
processor.start();
Thread.sleep(30000);
} catch (Exception e)
{
e.printStackTrace();
System.exit(-1);
}
}
When I run this source, the output is: No capture devices
What may be the problem? :-(
Edit: I uninstalled JMF from my system.
OK, after two and a half days stuck in the middle of nowhere, I figured out the problem myself.
The problem was that when I uninstalled JMF, it wasn't removed from the CLASSPATH user variable. There was something like:
"C:\PROGRA~1\JMF21~1.1E\lib\sound.jar;C:\PROGRA~1\JMF21~1.1E\lib\jmf.jar;C:\PROGRA~1\JMF21~1.1E\lib;"
When I removed those entries and restarted my computer: bingo. The code ran without any problem. :)
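A quick way to check for this kind of leftover (a hypothetical diagnostic, not from the original post) is to print the classpath the JVM actually sees:

// Stale JMF entries such as jmf.jar or sound.jar would show up here
// even after an uninstall.
public class ClasspathCheck {
    public static void main(String[] args) {
        System.out.println(System.getProperty("java.class.path"));
    }
}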
OK, so I have tried using ID3 tags to get the duration, and I also tried JMF's player.getDuration():
player.getDuration().getSeconds()
The file is VBR. Are there any lightweight libraries, or something inside JMF, that could solve this problem?
Thanks.
I use JAudioTagger to achieve this. The code below will get you the duration of an MP3 track:
int duration = 0;
try {
    AudioFile audioFile = AudioFileIO.read(new File("file.mp3"));
    duration = audioFile.getAudioHeader().getTrackLength();
} catch (Exception e) {
    e.printStackTrace();
}
You can alternatively cast audioFile.getAudioHeader() to MP3AudioHeader and use its getPreciseTrackLength() method to get a more precise duration. However, this (I believe) only applies to MP3 files and not to other formats (such as WAV files).
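For example, a small sketch of that cast (the file name is a placeholder):

import java.io.File;
import org.jaudiotagger.audio.AudioFile;
import org.jaudiotagger.audio.AudioFileIO;
import org.jaudiotagger.audio.AudioHeader;
import org.jaudiotagger.audio.mp3.MP3AudioHeader;

try {
    AudioFile audioFile = AudioFileIO.read(new File("file.mp3"));
    AudioHeader header = audioFile.getAudioHeader();
    if (header instanceof MP3AudioHeader) {
        // Fractional seconds, unlike getTrackLength()'s whole seconds
        double preciseSeconds = ((MP3AudioHeader) header).getPreciseTrackLength();
        System.out.println("Precise duration: " + preciseSeconds + " s");
    }
} catch (Exception e) {
    e.printStackTrace();
}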
I use this lib and this code:
File f = new File("path/mp3");
MediaLocator ml = null;
Player p = null;
try {
    ml = new MediaLocator(f.toURI().toURL()); // File.toURL() is deprecated
    p = Manager.createPlayer(ml);
    p.start();
    while (true) {
        Thread.sleep(1000);
        System.out.println("Media Time :: " + p.getMediaTime().getSeconds());
        System.out.println("Duration :: " + p.getDuration().getSeconds());
        // >= rather than ==: the two double values rarely match exactly
        if (p.getMediaTime().getSeconds() >= p.getDuration().getSeconds())
            break;
    }
    p.stop();
    p.deallocate();
    p.close();
} catch (NoPlayerException e) {
    e.printStackTrace();
} catch (IOException e) {
    e.printStackTrace();
} catch (InterruptedException e) {
    e.printStackTrace();
}
Good luck.