I am making a music player for a Raspberry Pi with a HiFiBerry module connected to it. I am using the JLayer library to play music. When I run the code on my PC (Ubuntu) it works fine, but when I run it on the Raspberry Pi I don't get any error, yet no sound plays. I tried reinstalling Java. It does not work even with the module disconnected.
I am using this piece of code:
import java.io.FileInputStream;
import javazoom.jl.player.advanced.AdvancedPlayer;

public class Main {
    public static void main(String[] args) throws Exception {
        // open the MP3 passed as the first argument and play it with JLayer
        AdvancedPlayer player = new AdvancedPlayer(new FileInputStream(args[0]));
        player.play();
    }
}
Is there any solution for this? Or can you suggest another library that supports MP3 files?
Somehow the audio was streaming to the wrong output, and I could not find a way to change the output port, so I switched to the MP3SPI library, which I managed to get working.
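For anyone hitting the same problem, here is a minimal sketch of what the MP3SPI route can look like, assuming the mp3spi, jlayer and tritonus-share jars are on the classpath (the class name is just illustrative):

import java.io.File;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.SourceDataLine;

public class Mp3SpiPlayer {
    public static void main(String[] args) throws Exception {
        // MP3SPI registers an MP3 reader with javax.sound.sampled,
        // so AudioSystem can open the file directly.
        AudioInputStream mp3Stream = AudioSystem.getAudioInputStream(new File(args[0]));
        AudioFormat baseFormat = mp3Stream.getFormat();
        // Decode to signed 16-bit PCM so a normal output line can play it.
        AudioFormat pcmFormat = new AudioFormat(AudioFormat.Encoding.PCM_SIGNED,
                baseFormat.getSampleRate(), 16, baseFormat.getChannels(),
                baseFormat.getChannels() * 2, baseFormat.getSampleRate(), false);
        AudioInputStream pcmStream = AudioSystem.getAudioInputStream(pcmFormat, mp3Stream);
        SourceDataLine line = AudioSystem.getSourceDataLine(pcmFormat);
        line.open(pcmFormat);
        line.start();
        byte[] buffer = new byte[4096];
        int read;
        while ((read = pcmStream.read(buffer)) != -1) {
            line.write(buffer, 0, read);
        }
        line.drain();
        line.close();
        pcmStream.close();
    }
}

If the default output is still the wrong device, AudioSystem.getMixerInfo() lets you enumerate the available outputs, and AudioSystem.getSourceDataLine(format, mixerInfo) gets a line from a specific one.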
I'm new to libGDX, and when I play a Sound I get a micro stutter/lag.
My file has the ".wav" extension. I already tried:
- changing the file extension
- making the file duration longer
I appreciate your help! :)) Have a nice day.
I recommend setting up an asset manager that loads the sound once, before it is needed. This allows reuse and encapsulates asset loading for all or part of the application. I strongly recommend against recreating (reloading) the sound each time. WAV and OGG both work well. Some articles/posts also recommend adjusting the audio buffer size.
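A minimal sketch of the AssetManager route (assuming an asset file named sfx_hit.wav, as in the answer below; the class and method names are just illustrative):

import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.assets.AssetManager;
import com.badlogic.gdx.audio.Sound;

public class AudioDemo extends ApplicationAdapter {
    private AssetManager assets;
    private Sound hit;

    @Override
    public void create() {
        assets = new AssetManager();
        assets.load("sfx_hit.wav", Sound.class); // queue the sound for loading
        assets.finishLoading();                  // block until everything queued is loaded
        hit = assets.get("sfx_hit.wav", Sound.class);
    }

    public void playHit() {
        hit.play(0.5f); // plays the already-loaded instance; no decoding happens here
    }

    @Override
    public void dispose() {
        assets.dispose(); // disposes every asset the manager loaded, including the sound
    }
}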
Create the Sound object (i.e. decode the sound file) once, inside create() or show(), and play that instance whenever you need it.
private Sound hit;

@Override
public void create() {
    // decode the file once, up front
    hit = Gdx.audio.newSound(Gdx.files.internal("sfx_hit.wav"));
}

public void playSound() {
    hit.play(0.5f); // volume 0.5
}

@Override
public void dispose() {
    hit.dispose(); // only dispose when you no longer need the sound
}
Possible reason: decoding a compressed file takes time, so avoid decoding the file every time you want to play the sound. A lower sample rate for your clip also means less processing.
I am using the IP Webcam app for Android, and it streams MJPEG video at the local URL:
http://192.168.0.2:8080/video
I was able to show the video using VLC player and this piece of code in C++.
With OpenCV 2.2 I opened the URL using:
VideoCapture cap;
cap.open("http://192.168.0.2:8080/video?dummy=param.mjpg");
It worked in C++, but I want it to work in Java. I was able to run OpenCV 2.4.9 from Java when taking pictures from my built-in webcam. This is my code for grabbing the images from a URL in Java:
System.loadLibrary("opencv_java249");
VideoCapture capture = new VideoCapture();
capture.open("http://192.168.0.2:8080/video?dummy=param.mjpg");
But capture.open does not open the stream, and I could not debug it properly. I suspect it is an issue with FFmpeg, since it works on OpenCV 2.2. I also know that my OpenCV 2.2 build is specific to Visual Studio 2010 and might be more complete.
Would it help if I compile the OpenCV2.4.9 from sources? Is there a file that I could add to solve that problem? Is there another way of receiving the video from the IP camera and using on OpenCV?
It took me a while to figure this out. I could not receive the stream directly from OpenCV's Java bindings, so I downloaded:
http://www.mediafire.com/download/ayxwnwnqv3mpg39/javacv-0.7-bin.zip http://www.mediafire.com/download/2rkk0rjwxov7ale/javacv-0.7-cppjars.zip
I believe this is a Java wrapper around OpenCV's C API. I took this link from this video:
http://www.youtube.com/watch?v=mIYaHCyZICI
After unzipping the archives, I added the jars to my project and used this code:
package javaapplication7;

import com.googlecode.javacv.CanvasFrame;
import com.googlecode.javacv.OpenCVFrameGrabber;
import com.googlecode.javacv.cpp.opencv_core.IplImage;

public class JavaApplication7 {
    public static void main(String[] args) throws Exception {
        OpenCVFrameGrabber grabber = new OpenCVFrameGrabber("http://192.168.0.2:8080/video?dummy=param.mjpg");
        grabber.setFormat("mjpeg");
        grabber.start();
        // crude busy-wait while the grabber connects to the stream
        for (int k = 0; k < 20000; k++) {
            System.out.print(k);
        }
        IplImage frame = grabber.grab();
        CanvasFrame canvasFrame = new CanvasFrame("Camera");
        canvasFrame.setCanvasSize(frame.width(), frame.height());
        while (canvasFrame.isVisible() && (frame = grabber.grab()) != null) {
            canvasFrame.showImage(frame);
        }
        grabber.stop();
        canvasFrame.dispose();
        System.exit(0);
    }
}
Which I got from:
http://stackoverflow.com/questions/14251290/cvcreatefilecapture-error-could-not-create-camera-capture-with-javacv
It takes 15-20 seconds to start catching the stream, but I was impressed with the delay, which is much smaller than VLC's: 1-2 seconds compared to 3-4 seconds in VLC. I would like to upvote the person I took the answer from, but I don't have enough reputation.
I also ran into the same problem, but the easiest solution I found was to use DroidCam instead of the IP Webcam app. Check it out here.
Hi, I am programming Java on Windows and am very new to working with MIDI interfaces.
I have managed to get Java to play MIDI sounds through Synthesizer objects, natively through the computer's speakers, but I want to send MIDI messages on the fly to a separate synthesis application, namely FLStudio. I think I have to make the Java interface look like a hardware MIDI device, but I have no idea how to do this. I also think it may have something to do with Transmitter or MidiDevice, but I'm not sure.
Does anyone know how I would begin to go about this? I have looked all over Google but always end up at the same two documents:
http://www.jsresources.org/faq_midi.html
and
http://www.ibm.com/developerworks/library/it/it-0801art38/
Sorry if this question has been asked before, but I couldn't find it.
Here's what I have so far. Any help would be greatly appreciated.
import javax.sound.midi.*;

public class Midi
{
    public static final void main(String args[]) throws Exception
    {
        // create and open the synthesizer
        Synthesizer syn = MidiSystem.getSynthesizer();
        syn.open();

        // open the MIDI channels (we'll use channel 5)
        final MidiChannel[] mc = syn.getChannels();

        // set instruments
        Instrument[] instr = syn.getDefaultSoundbank().getInstruments();

        // Possible ways to send MIDI to FLStudio, rather than the built-in synth:
        // javax.sound.midi.Transmitter?
        // javax.sound.midi.MidiDevice?

        // change instrument, using MIDI codes
        mc[5].programChange(instr[0].getPatch().getProgram());

        // play a note
        mc[5].noteOn(50, 1000); // (noteNumber, velocity)
    }
}
You can use a program like MidiOx to create a virtual MIDI endpoint which you can send MIDI messages to. Then, in your sequencer, you just tell it to accept MIDI messages from the output of that device, and you can use it as a passthru pipe.
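A rough sketch of what the Java side could look like once a virtual MIDI port exists and javax.sound.midi can see it. The port name "Yoke" below is only a placeholder; match against whatever name your virtual device actually reports:

import javax.sound.midi.MidiDevice;
import javax.sound.midi.MidiSystem;
import javax.sound.midi.Receiver;
import javax.sound.midi.ShortMessage;

public class MidiOut {
    public static void main(String[] args) throws Exception {
        // List every MIDI device Java can see; the virtual endpoint should show up by name.
        MidiDevice target = null;
        for (MidiDevice.Info info : MidiSystem.getMidiDeviceInfo()) {
            System.out.println(info.getName() + " - " + info.getDescription());
            MidiDevice device = MidiSystem.getMidiDevice(info);
            // A device we can send to has receivers available (maxReceivers != 0).
            if (info.getName().contains("Yoke") && device.getMaxReceivers() != 0) { // placeholder name
                target = device;
            }
        }
        if (target == null) {
            System.out.println("Virtual MIDI port not found");
            return;
        }
        target.open();
        Receiver out = target.getReceiver();
        // Send note-on / note-off on channel 5, like the Synthesizer example above.
        out.send(new ShortMessage(ShortMessage.NOTE_ON, 5, 50, 100), -1);
        Thread.sleep(1000);
        out.send(new ShortMessage(ShortMessage.NOTE_OFF, 5, 50, 0), -1);
        target.close();
    }
}

In your sequencer you would then select that same virtual port as a MIDI input.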
On my machine Bluetooth works fine: I can stream audio to the headset and record from it, except when I run a Java program that plays sound. The sound files play through the regular speakers but are not forwarded to the headset. My operating system is Lubuntu 10.04.
My code to play a sound is:
public static void playSound(File sound) {
    try {
        // Applet.newAudioClip expects a URL; go through toURI() rather than the deprecated File.toURL()
        AudioClip cp = Applet.newAudioClip(sound.toURI().toURL());
        cp.play();
    } catch (MalformedURLException ex) {
        ex.printStackTrace();
    }
}
The Applet.newAudioClip() method is pretty darn old, like Java 1.0 old. Since then Java has rewritten a lot of its sound APIs. I bet whatever code is playing that sound doesn't take the OS's audio settings into account. The javax.sound.sampled package has the newer APIs, and while they are harder to learn, they give you much more control over how the sound is played and modified.
http://download.oracle.com/javase/tutorial/sound/sampled-overview.html
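For comparison, a minimal javax.sound.sampled sketch (it plays WAV/AIFF/AU out of the box; the file name is just an example):

import java.io.File;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.Clip;

public class PlayClip {
    public static void main(String[] args) throws Exception {
        AudioInputStream in = AudioSystem.getAudioInputStream(new File("sound.wav"));
        Clip clip = AudioSystem.getClip(); // uses the default mixer / OS output
        clip.open(in);
        clip.start();
        // keep the JVM alive until the clip has finished playing
        Thread.sleep(clip.getMicrosecondLength() / 1000);
        clip.close();
    }
}

If the default mixer is not the Bluetooth device, you can enumerate AudioSystem.getMixerInfo() and pass a specific Mixer.Info to AudioSystem.getClip(mixerInfo) instead.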
You could also test whether Java can play audio over your Bluetooth headset at all by downloading a player from
http://www.javazoom.net/index.shtml
and trying to play an MP3 to see if it goes over your Bluetooth headset.
I'm experimenting with JavaFX making a small game.
I want to add sound.
How?
I tried MediaPlayer with the media source defined via a relative path, like:
attribute media = Media{
    source: "{__FILE__}/sound/hormpipe.mp3"
}
attribute player = MediaPlayer{
    autoPlay: true
    media: media
}
It doesn't play.
I get
FX Media Object caught Exception com.sun.media.jmc.MediaUnavailableException: Media unavailable: file: ... Sound.class/sound/hormpipe.mp3
Just a guess, but is that file "hornpipe.mp3" and not "hormpipe.mp3" (with an m)?
var player = javafx.scene.media.MediaPlayer {
    repeatCount: javafx.scene.media.MediaPlayer.REPEAT_FOREVER
    media: Media { source: "{__DIR__}clip.wav" }
};
player.play();
You have to include the audio file in the build/compiled directory so NetBeans can pack it into the jar file.
Just a guess, but I think your {__FILE__} will expand to the name of your file. Try replacing it with {__DIR__}.
Also note that {__DIR__} includes the trailing /, so try this instead:
attribute media = Media{
    source: "{__DIR__}sound/hormpipe.mp3"
}
EDIT: I did some digging, and apparently, the source of a Media object has to be either a remote URL, or an absolute file path, since media files aren't allowed in JARs (something I hope gets changed with future releases, since I really like JavaFX and want to be able to make desktop apps with it). See: JavaFX FAQs.
This worked for me:
MediaPlayer audio = new MediaPlayer(
        new Media(new File("file.mp3").toURI().toString()));
audio.play();
The source file should be in the project's root directory (not src, not dist).
OK, having used this question to get MP3 audio working (kinda), I've learned the following (not much).
1) Audio for compressed formats is very platform dependent. My continually upgraded Mint 17.1 -> 18 machine plays MP3 fine using Media and MediaPlayer. Fresh installs of Mint 18 (with the dev tools) won't.
So use .wav files.
Media sound=new Media(new File("noises/roll.wav").toURI().toString());
MediaPlayer mediaPlayer=new MediaPlayer(sound);
mediaPlayer.play();
2) One thing you need to be aware of with Media/MediaPlayer is that in order to play a sound multiple times (repeatedly, or several at once, e.g. on a button press in a game) you have to spawn N MediaPlayer objects, and each one will play once and then stop.
So use javafx.scene.media.AudioClip
AudioClip soundMyNoise = new AudioClip(new File("noises/roll.wav").toURI().toString());
soundMyNoise.play();
AudioClip also has its issues, which include storing the raw audio data in RAM all at once instead of buffering. So there is the possibility of excessive memory use.
No matter which method you end up going with, one thing to be critically aware of was mentioned by daevon earlier: the path issue. With NetBeans, you have NetBeansProjects/yourproject/src/yourproject/foo.java, while the sounds in the examples above go in NetBeansProjects/yourproject/noises/roll.wav.
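If you want to take the working directory out of the equation entirely, one option is to load the sound from the classpath instead of from a File. This is only a sketch, assuming the build copies /noises/roll.wav into the output jar (the class name is illustrative):

import java.net.URL;
import javafx.application.Application;
import javafx.scene.Scene;
import javafx.scene.layout.StackPane;
import javafx.scene.media.AudioClip;
import javafx.stage.Stage;

public class ClasspathSoundDemo extends Application {
    @Override
    public void start(Stage stage) {
        // getResource resolves against the classpath (the jar / build folder),
        // so the path no longer depends on where the app was launched from.
        URL resource = getClass().getResource("/noises/roll.wav");
        AudioClip roll = new AudioClip(resource.toExternalForm());
        roll.play();
        stage.setScene(new Scene(new StackPane(), 200, 100));
        stage.show();
    }

    public static void main(String[] args) {
        launch(args);
    }
}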