Android 6.0 (Marshmallow): How to play midi notes? - java

I'm creating an app that generates live instrument sounds, and I'm planning on using the new MIDI API featured in Android Marshmallow (version 6.0). I've read the package overview document here http://developer.android.com/reference/android/media/midi/package-summary.html and I know how to generate MIDI notes, but I'm still unsure: how do I actually play these notes after I've generated their MIDI data?
Do I need a synthesizer program to play MIDI notes? If so, do I have to make my own, or is one provided by Android or a third party?
I am a novice with MIDI, so please be as descriptive as possible in your answer.
What I've tried so far:
I've created a MidiManager object and opened an input port:
MidiManager m = (MidiManager)context.getSystemService(Context.MIDI_SERVICE);
MidiInputPort inputPort = device.openInputPort(index);
Then, I've sent a test noteOn MIDI message to the port:
byte[] buffer = new byte[32];
int numBytes = 0;
int channel = 3; // MIDI channels 1-16 are encoded as 0-15.
buffer[numBytes++] = (byte)(0x90 + (channel - 1)); // note on
buffer[numBytes++] = (byte)60; // pitch is middle C
buffer[numBytes++] = (byte)127; // max velocity
int offset = 0;
// post is non-blocking
inputPort.send(buffer, offset, numBytes);
I've also set up a class to receive the MIDI note messages:
class MyReceiver extends MidiReceiver {
    @Override
    public void onSend(byte[] data, int offset,
            int count, long timestamp) throws IOException {
        // parse MIDI or whatever
    }
}
MidiOutputPort outputPort = device.openOutputPort(index);
outputPort.connect(new MyReceiver());
Now, here's where I'm most confused. The use case of my app is to be an all-in-one composition and playback tool for making music. In other words, my app needs to contain or use a virtual MIDI device (such as another app's MIDI synthesizer, launched via an intent). Unless someone has already made such a synthesizer, I must create one myself within my app's lifecycle. How do I actually convert a received noteOn() into sound coming out of my speakers? I'm especially confused because there also has to be a way to programmatically decide what instrument the note sounds like it's coming from: is this also done in a synthesizer?
MIDI support in Android Marshmallow is fairly new, so I haven't been able to find any tutorials or sample synthesizer apps online. Any insight is appreciated.

I haven't found any "official" way to control the internal synthesizer from Java code.
Probably the easiest option is to use the Android MIDI driver for the Sonivox synthesizer.
Get it as an AAR package (unzip the *.zip) and store the *.aar file somewhere in your workspace. The path doesn't really matter, and it doesn't need to be inside your own app's folder structure, but the "libs" folder inside your project could be a logical place.
With your Android project open in Android Studio:
File -> New -> New Module -> Import .JAR/.AAR Package -> Next -> find and select the "MidiDriver-all-release.aar" (change the subproject name if you want) -> Finish
Wait for Gradle to do its magic, then go to your "app" module's settings (your own app project's settings), open the "Dependencies" tab, and add (with the green "+" sign) the MIDI Driver as a module dependency. Now you have access to the MIDI Driver:
import org.billthefarmer.mididriver.MidiDriver;
...
MidiDriver midiDriver = new MidiDriver();
Without having to worry about the NDK and C++ at all, you have these Java methods available:
// Not really necessary. Receives a callback when/if start() has succeeded.
midiDriver.setOnMidiStartListener(listener);
// Starts the driver.
midiDriver.start();
// Receives the driver's config info.
midiDriver.config();
// Stops the driver.
midiDriver.stop();
// Just calls write().
midiDriver.queueEvent(event);
// Sends a MIDI event to the synthesizer.
midiDriver.write(event);
A very basic "proof of concept" for playing and stopping a note could be something like:
package com.example.miditest;

import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.view.MotionEvent;
import android.view.View;
import android.widget.Button;

import org.billthefarmer.mididriver.MidiDriver;

public class MainActivity extends AppCompatActivity
        implements MidiDriver.OnMidiStartListener, View.OnTouchListener {

    private MidiDriver midiDriver;
    private byte[] event;
    private int[] config;
    private Button buttonPlayNote;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        buttonPlayNote = (Button) findViewById(R.id.buttonPlayNote);
        buttonPlayNote.setOnTouchListener(this);

        // Instantiate the driver.
        midiDriver = new MidiDriver();
        // Set the listener.
        midiDriver.setOnMidiStartListener(this);
    }

    @Override
    protected void onResume() {
        super.onResume();
        midiDriver.start();

        // Get the configuration.
        config = midiDriver.config();

        // Print out the details.
        Log.d(this.getClass().getName(), "maxVoices: " + config[0]);
        Log.d(this.getClass().getName(), "numChannels: " + config[1]);
        Log.d(this.getClass().getName(), "sampleRate: " + config[2]);
        Log.d(this.getClass().getName(), "mixBufferSize: " + config[3]);
    }

    @Override
    protected void onPause() {
        super.onPause();
        midiDriver.stop();
    }

    @Override
    public void onMidiStart() {
        Log.d(this.getClass().getName(), "onMidiStart()");
    }

    private void playNote() {
        // Construct a note ON message for the middle C at maximum velocity on channel 1:
        event = new byte[3];
        event[0] = (byte) (0x90 | 0x00); // 0x90 = note on, 0x00 = channel 1
        event[1] = (byte) 0x3C; // 0x3C = middle C
        event[2] = (byte) 0x7F; // 0x7F = maximum velocity (127)

        // Internally this just calls write() and can be considered obsolete:
        // midiDriver.queueEvent(event);

        // Send the MIDI event to the synthesizer.
        midiDriver.write(event);
    }

    private void stopNote() {
        // Construct a note OFF message for the middle C at minimum velocity on channel 1:
        event = new byte[3];
        event[0] = (byte) (0x80 | 0x00); // 0x80 = note off, 0x00 = channel 1
        event[1] = (byte) 0x3C; // 0x3C = middle C
        event[2] = (byte) 0x00; // 0x00 = minimum velocity (0)

        // Send the MIDI event to the synthesizer.
        midiDriver.write(event);
    }

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        Log.d(this.getClass().getName(), "Motion event: " + event);
        if (v.getId() == R.id.buttonPlayNote) {
            if (event.getAction() == MotionEvent.ACTION_DOWN) {
                Log.d(this.getClass().getName(), "MotionEvent.ACTION_DOWN");
                playNote();
            }
            if (event.getAction() == MotionEvent.ACTION_UP) {
                Log.d(this.getClass().getName(), "MotionEvent.ACTION_UP");
                stopNote();
            }
        }
        return false;
    }
}
The layout file just has one button that plays the predefined note when held down and stops it when released:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:paddingBottom="@dimen/activity_vertical_margin"
    android:paddingLeft="@dimen/activity_horizontal_margin"
    android:paddingRight="@dimen/activity_horizontal_margin"
    android:paddingTop="@dimen/activity_vertical_margin"
    tools:context="com.example.miditest.MainActivity"
    android:orientation="vertical">

    <Button
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Play a note"
        android:id="@+id/buttonPlayNote" />

</LinearLayout>
It is actually this simple. The code above could well be a starting point for a touch piano app with 128 selectable instruments, very decent latency, and proper "note off" functionality, which many apps lack.
As for choosing the instrument: you'll just need to send a MIDI "program change" message on the channel on which you intend to play, to choose one of the 128 sounds in the General MIDI sound set. But that's related to the details of MIDI, not to the usage of the library.
Likewise, you'll probably want to abstract away the low-level details of MIDI so that you can easily play a specific note on a specific channel with a specific instrument at a specific velocity for a specific time; for that you might find some clues in the many open source Java and MIDI related applications and libraries made so far.
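To make those low-level details concrete, here is a minimal sketch in plain Java of building the raw program change / note on / note off messages that would each be passed to midiDriver.write(). The MidiMessages class and its method names are my own, not part of the driver's API:

```java
// Minimal helpers for building raw MIDI channel messages as byte arrays.
// The class and method names are illustrative, not part of any library.
public class MidiMessages {

    // Program change: status 0xC0 | channel, then the program number (0-127).
    public static byte[] programChange(int channel, int program) {
        return new byte[] { (byte) (0xC0 | (channel & 0x0F)), (byte) program };
    }

    // Note on: status 0x90 | channel, key number, velocity.
    public static byte[] noteOn(int channel, int note, int velocity) {
        return new byte[] { (byte) (0x90 | (channel & 0x0F)), (byte) note, (byte) velocity };
    }

    // Note off: status 0x80 | channel, key number, release velocity.
    public static byte[] noteOff(int channel, int note) {
        return new byte[] { (byte) (0x80 | (channel & 0x0F)), (byte) note, (byte) 0 };
    }

    public static void main(String[] args) {
        // E.g. select General MIDI program 19 (a church organ sound) on
        // channel 0, then play middle C. With the MIDI driver you would
        // pass each array to midiDriver.write(...).
        System.out.printf("%02X%n", programChange(0, 19)[0] & 0xFF);
        System.out.printf("%02X%n", noteOn(0, 60, 127)[0] & 0xFF);
    }
}
```

Sending `programChange(...)` once before the note messages is enough; the channel keeps the instrument until the next program change.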
This approach doesn't require Android 6.0, by the way. And at the moment only 4.6% of devices visiting the Play Store run Android 6.x, so there wouldn't be much of an audience for a 6.0-only app.
Of course, if you want to use the android.media.midi package, you could then use the library to implement an android.media.midi.MidiReceiver that receives the MIDI events and plays them on the internal synthesizer. Google already has some demo code that plays notes with square and saw waves; just replace that part with the internal synthesizer.
Another option could be to check the status of the FluidSynth ports to Android; I guess there might be something available.
Edit: Other possibly interesting libraries:
port of Java's javax.sound.midi package for abstracting the low level MIDI technical details
USB MIDI Driver for connecting to a digital piano/keyboard with a USB MIDI connector
MIDI over Bluetooth LE driver for connecting wirelessly to a digital piano/keyboard that supports MIDI over Bluetooth LE (like e.g. some recent Roland and Dexibell digital pianos)
JFugue Music library port for Android for further abstracting the MIDI details and instead thinking in terms of music theory

Do I need a synthesizer program to play MIDI notes? If so, do I have to make my own, or is one provided by Android or a third party?
No, fortunately you don't need to make your own synthesizer. Android already has one built in: the SONiVOX Embedded Audio Synthesizer. Android states in the docs on SONiVOX JETCreator:
JET works in conjunction with SONiVOX's Embedded Audio Synthesizer (EAS) which is the MIDI playback device for Android.
It wasn't clear whether you want real-time playback, or whether you want to create a composition first and play it later within the same app. You also state that you want to play MIDI notes, not files. But, just so you know, MIDI playback is supported on Android devices, so playing a .mid file can be done the same way you would play a .wav file using MediaPlayer.
To be honest, I haven't used the midi package or done MIDI playback myself, but if you can create a .mid file and save it to disk, then you should be able to play it back using straight MediaPlayer.
Now, if you want to play straight MIDI notes, not files, then you can use this mididriver package. Using this package you should be able to write MIDI data to the embedded synthesizer:
/**
 * Writes MIDI data to the Sonivox synthesizer.
 * The length of the array should be the exact length
 * of the message or messages. Returns true on success,
 * false on failure.
 */
boolean write(byte[] buffer)
If you want to step even lower than that, you could even play straight PCM using AudioTrack.
For additional info, here is a blog post I found from someone who seemed to have similar troubles to yours. He states:
Personally I solved the dynamic midi generation issue as follows: programmatically generate a midi file, write it to the device storage, initiate a mediaplayer with the file and let it play. This is fast enough if you just need to play a dynamic midi sound. I doubt it’s useful for creating user controlled midi stuff like sequencers, but for other cases it’s great.
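As a sketch of that generate-a-file approach: on the desktop JVM the javax.sound.midi package can build a Sequence and write it out as a standard .mid file. Android does not ship javax.sound.midi, so there you would use the ported library mentioned elsewhere in this thread (or write the file bytes yourself), and then hand the resulting file to MediaPlayer. The file name here is arbitrary:

```java
import javax.sound.midi.MidiEvent;
import javax.sound.midi.MidiSystem;
import javax.sound.midi.Sequence;
import javax.sound.midi.ShortMessage;
import javax.sound.midi.Track;
import java.io.File;

public class MidiFileDemo {
    public static void main(String[] args) throws Exception {
        // One track, 24 ticks per quarter note.
        Sequence sequence = new Sequence(Sequence.PPQ, 24);
        Track track = sequence.createTrack();

        // Middle C, velocity 93, on channel 0: note on at tick 0, note off at tick 24.
        track.add(new MidiEvent(new ShortMessage(ShortMessage.NOTE_ON, 0, 60, 93), 0));
        track.add(new MidiEvent(new ShortMessage(ShortMessage.NOTE_OFF, 0, 60, 0), 24));

        // Write a type 0 standard MIDI file; a MediaPlayer could then play it.
        File out = new File("demo.mid");
        MidiSystem.write(sequence, 0, out);
        System.out.println("Wrote " + out.length() + " bytes");
    }
}
```

On Android, the last step would be replaced by `mediaPlayer.setDataSource(path)` followed by `prepare()` and `start()`.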
Hope I covered everything.

To generate sound using the Android MIDI API, you need a synthesizer app that accepts MIDI input. Unfortunately, this is the only such app I have found on Google Play:
https://play.google.com/store/apps/details?id=com.mobileer.midisynthexample
I was able to play music by sending note on and note off messages to this app, but program change worked poorly. Unless I did something wrong in my code, it seems the app has only two instruments.
However, there are some people working on other synthesizer apps, so I expect more to be available soon. This one looks promising, though I haven't tested it myself yet:
https://github.com/pedrolcl/android/tree/master/NativeGMSynth

Related

Why does isOperational() in mobile vision text Recognizer return true on one device and false on another?

Why does isOperational() in the mobile vision text recognizer return false?
At first, mobile vision only showed the camera preview; after many tries to get a result, I saw that text was recognized, but it works on one device and not on another.
What should I do?
For example, on one device isOperational() returns false, then execution goes to readstate(), after that goes to looper() and stays in it!
On the other device it only returns false and doesn't go to the looper.
I want to ask some other questions about it:
My first question is: how does isOperational() work? I can't understand it.
Maybe it goes to the looper to download the native library in a queue, and after many tries the download finally completes and it works. Can that be correct? Or is it just a bug that it goes to the looper? Either way, what should I do?
Can I keep working on this when it works on one device I tried but not on another? Or must it work on every device before I can continue? Also, I built an .apk from the project but it can't be installed on devices. Why?
Should it check for network?
Should it check for access to the memory?
Note: it works with the camera API, which is deprecated. Maybe the problem is with this!
TextRecognizer textRecognizer = new TextRecognizer.Builder(context).build();
textRecognizer.setProcessor(new OcrDetectorProcessor(graphicOverlay));

if (!textRecognizer.isOperational()) {
    // Note: The first time that an app using a Vision API is installed on a
    // device, GMS will download native libraries to the device in order to do detection.
    // Usually this completes before the app is run for the first time. But if that
    // download has not yet completed, then the above call will not detect any text,
    // barcodes, or faces.
    //
    // isOperational() can be used to check if the required native libraries are currently
    // available. The detectors will automatically become operational once the library
    // downloads complete on device.
    Log.w(TAG, "Detector dependencies are not yet available.");

    // Check for low storage. If there is low storage, the native library will not be
    // downloaded, so detection will not become operational.
    IntentFilter lowstorageFilter = new IntentFilter(Intent.ACTION_DEVICE_STORAGE_LOW);
    boolean hasLowStorage = registerReceiver(null, lowstorageFilter) != null;
    if (hasLowStorage) {
        Toast.makeText(this, R.string.low_storage_error, Toast.LENGTH_LONG).show();
        Log.w(TAG, getString(R.string.low_storage_error));
    }
}

// Creates and starts the camera. Note that this uses a higher resolution in comparison
// to other detection examples to enable the text recognizer to detect small pieces of text.
cameraSource = new CameraSource.Builder(getApplicationContext(), textRecognizer)
        .setFacing(CameraSource.CAMERA_FACING_BACK)
        .setRequestedPreviewSize(1280, 1024)
        .setRequestedFps(2.0f)
        .setFlashMode(useFlash ? Camera.Parameters.FLASH_MODE_TORCH : null)
        .setFocusMode(autoFocus ? Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO : null)
        .build();
It doesn't produce any error and shows the camera preview, but doesn't recognize text on some devices.

Java set MIDI Out receiver

Hey, I am trying to send MIDI data from a Java class to a MIDI device connected via USB. I did this once about two years ago and it worked, but I somehow can't find the project anymore.
The example Java code runs fine:
ShortMessage myMsg = new ShortMessage();
myMsg.setMessage(ShortMessage.NOTE_ON, 0, 60, 93);
long timeStamp = -1;
Receiver rcvr = MidiSystem.getReceiver();
rcvr.send(myMsg, timeStamp);
Simple stuff: five lines of code, and the message appears on the device. The problem is that this way only the default device is set up and ready to receive MIDI. I can't ask my users to set the device of their choice as the default device every time they want to use my application. (The Receiver acts as the output destination/port to the input of the physical device I am connected to.)
I am now trying to set up the Receiver by doing the following:
MidiDevice.Info[] infoA = MidiSystem.getMidiDeviceInfo(); // array where all the device info goes
for (int x = 0; x < infoA.length; x++) {
    System.out.println("in " + infoA[x]); // this displays all the devices
}
// d is an integer that represents the item number, on the device list above,
// of the device I want to send MIDI to:
System.out.println(infoA[d]); // last check that the correct device is selected
MidiDevice midiOutDevice = MidiSystem.getMidiDevice(infoA[d]); // "my" device, used to set the Receiver
maschineReceiver = midiOutDevice.getReceiver();
Sequencer midiOutSequencer = MidiSystem.getSequencer();
midiOutSequencer.getTransmitter().setReceiver(maschineReceiver); // probably unnecessary last 2 lines, but I gave this a try in case it helps
If I now do maschineReceiver.send(myMsg, timeStamp);, nothing happens at all. I also tried different devices, but it didn't get any better. I am sure it can't be a very difficult thing to do, as it is something I actually achieved two years ago when my coding skills were awful, but I just can't find the mistake right now; no matter how often I reread the Java documentation, it just won't work whatever I do.
Thanks in advance
To actually reserve a device for your program, you need to use the MidiDevice method open:
if (!device.isOpen()) {
    try {
        device.open();
    } catch (MidiUnavailableException e) {
        // Handle or throw exception...
    }
}
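Putting the pieces together, a minimal end-to-end sketch could look like the following. The device name "Maschine" is just an example of what to search for, and the code deliberately does nothing when no matching device is present:

```java
import javax.sound.midi.MidiDevice;
import javax.sound.midi.MidiSystem;
import javax.sound.midi.Receiver;
import javax.sound.midi.ShortMessage;

public class MidiOutDemo {

    // Pure helper: index of the first device whose name contains the given
    // fragment, or -1 when nothing matches (e.g. no MIDI hardware at all).
    public static int findDevice(MidiDevice.Info[] infos, String nameFragment) {
        for (int i = 0; i < infos.length; i++) {
            if (infos[i].getName().contains(nameFragment)) {
                return i;
            }
        }
        return -1;
    }

    public static void main(String[] args) throws Exception {
        MidiDevice.Info[] infos = MidiSystem.getMidiDeviceInfo();
        int index = findDevice(infos, "Maschine"); // example name, not a real constant
        if (index < 0) {
            System.out.println("Device not found; nothing to open.");
            return;
        }
        MidiDevice device = MidiSystem.getMidiDevice(infos[index]);
        try {
            device.open(); // reserve the device before asking for its receiver
            Receiver receiver = device.getReceiver();
            receiver.send(new ShortMessage(ShortMessage.NOTE_ON, 0, 60, 93), -1);
        } finally {
            if (device.isOpen()) {
                device.close();
            }
        }
    }
}
```

The key point is that open() is called on the chosen device itself before getReceiver(), rather than relying on MidiSystem.getReceiver(), which always hands back the default device.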

How to interface Java midi to other applications

Hi, I am programming Java on Windows and am very new to working with MIDI interfaces.
I have managed to get Java to play MIDI sounds through Synthesizer objects, natively through the computer's speakers, but I wish to send MIDI messages on the fly to a separate synthesis application, namely FL Studio. I think I have to make the Java interface look like a hardware MIDI device, but I have no idea how to do this. I also think it may have something to do with Transmitter or MidiDevice, but I'm not sure.
Does anyone know how I would begin to go about this? I have looked all over Google about this but always end up at the same two documents:
http://www.jsresources.org/faq_midi.html
and
http://www.ibm.com/developerworks/library/it/it-0801art38/
Sorry if this question has been asked before, but I couldn't find it.
Here's what I have so far. Any help would be greatly appreciated.
import javax.sound.midi.*;

public class Midi {
    public static void main(String[] args) throws Exception {
        // Create and open the synthesizer.
        Synthesizer syn = MidiSystem.getSynthesizer();
        syn.open();

        // Get the MIDI channels (we'll use channel 5).
        final MidiChannel[] mc = syn.getChannels();

        // Get the available instruments.
        Instrument[] instr = syn.getDefaultSoundbank().getInstruments();

        // Possible ways to send MIDI to FLStudio, rather than the built-in synthesizer:
        // javax.sound.midi.Transmitter?
        // javax.sound.midi.MidiDevice?

        // Change instrument, using MIDI codes.
        mc[5].programChange(instr[0].getPatch().getProgram());

        // Play a note (note number, velocity; velocity must be 0-127).
        mc[5].noteOn(50, 100);
    }
}
You can use a program like MidiOx to create a virtual MIDI endpoint which you can send MIDI messages to. Then, in your sequencer, you just tell it to accept MIDI messages from the output of that device, and you can use it as a passthru pipe.

Android: Playing sound over Sco Bluetooth headset

For the past few days I have been trying to play any sound over my SCO Bluetooth headset from my Android phone. My final goal with this project is to eventually make a garage door opener, but first I need to be able to play sound over the headset.
Here is the basis of the current code I'm using:
==Manifest==
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
==Code==
audioManager = (AudioManager) getSystemService(AUDIO_SERVICE);
audioManager.setMode(AudioManager.MODE_IN_CALL);
audioManager.startBluetoothSco();
audioManager.setBluetoothScoOn(true);

short[] soundData = new short[8000 * 20];
for (int iii = 0; iii < 20 * 8000; iii++) {
    soundData[iii] = 32767;
    iii++;
    soundData[iii] = -32768;
}

audioTrack = new AudioTrack(AudioManager.STREAM_VOICE_CALL,
        8000, AudioFormat.CHANNEL_OUT_MONO,
        AudioFormat.ENCODING_PCM_16BIT,
        soundData.length * 2, // buffer size is in bytes: 2 bytes per 16-bit sample
        AudioTrack.MODE_STATIC);
audioTrack.write(soundData, 0, soundData.length);
audioTrack.play();
Before I run this, I pair my Bluetooth headset with my phone and have it connected. I have verified it works by calling my voicemail. When I run my code, however, no sound comes from anywhere.
Here are the effects of the different lines of code:
When I'm just running my app:
audioManager.setMode(AudioManager.MODE_IN_CALL);
This line makes all of my sound stop working no matter what I do, so it is typically commented out.
audioManager.startBluetoothSco();
audioManager.setBluetoothScoOn(true);
These two lines make the sound stop coming out of the front speaker and make my headset click and hiss like it's turned on but there's no output.
AudioManager.STREAM_VOICE_CALL
This is part of my call to the AudioTrack constructor, but it makes quite a difference. Since this is set to STREAM_VOICE_CALL, the sound comes out of the front speaker; if I set it to STREAM_MUSIC, the sound comes out of the back speaker instead.
When I open my app during a call:
audioManager.setMode(AudioManager.MODE_IN_CALL);
During a call, this line has no effect because MODE_IN_CALL was already set. What's different, however, is that my sound is mixed with the phone call, whereas normally it doesn't play at all.
audioManager.startBluetoothSco();
audioManager.setBluetoothScoOn(true);
These, with their counterpart "off" halves, control where the audio comes from. If I turn these off, my sound and the telephone call come from the front speaker; with these on, the telephone call comes from my headset and my sound is lost.
As to why my code is not working, I honestly have no idea. I believe I have fulfilled the checklist for using startBluetoothSco().
Even if a SCO connection is established, the following restrictions
apply on audio output streams so that they can be routed to SCO headset:
- the stream type must be STREAM_VOICE_CALL
- the format must be mono
- the sampling must be 16 kHz or 8 kHz
So, does anyone have an idea of what I am doing wrong? There was one time when I managed to get my sound to play through the headset, but it was only a short tone when I forgot to stop() my AudioTrack, so I have to assume it was a glitch.
I found the solution on this page.
You have to call audioManager.setMode(AudioManager.MODE_IN_CALL); only after the socket is connected, i.e., after you have received AudioManager.SCO_AUDIO_STATE_CONNECTED. I could hear the TTS on my Spica running Android 2.2.2.
Edit:
Here is my (old) implementation:
public class BluetoothNotificationReceiver extends BroadcastReceiver {

    private final Handler bnrHandler;

    public BluetoothNotificationReceiver(Handler h) {
        super();
        bnrHandler = h;
    }

    /* (non-Javadoc)
     * @see android.content.BroadcastReceiver#onReceive(android.content.Context, android.content.Intent)
     */
    @Override
    public void onReceive(Context arg0, Intent arg1) {
        String action = arg1.getAction();
        if (action.equalsIgnoreCase(AudioManager.ACTION_SCO_AUDIO_STATE_CHANGED)) {
            int l_state = arg1.getIntExtra(AudioManager.EXTRA_SCO_AUDIO_STATE, -1);
            Log.d("bnr", "Audio SCO: " + AudioManager.ACTION_SCO_AUDIO_STATE_CHANGED);
            switch (l_state) {
                case AudioManager.SCO_AUDIO_STATE_CONNECTED:
                    Log.i("bnr", "SCO_AUDIO_STATE_CONNECTED");
                    break;
                case AudioManager.SCO_AUDIO_STATE_DISCONNECTED:
                    Log.e("bnr", "SCO_AUDIO_STATE_DISCONNECTED");
                    break;
                default:
                    Log.e("bnr", "unknown state received: " + l_state);
            }
        } else {
            Log.e("bnr", "onReceive: action=" + action);
        }
    }
}
I don't know if it is still working with the new APIs.
Try setting the AudioManager to AudioManager.MODE_NORMAL. That worked for me.
Add to manifest:
<uses-permission android:name="android.permission.BLUETOOTH_ADMIN" />
<uses-permission android:name="android.permission.BLUETOOTH" />
I actually managed to fix this. The problem was that the pitch I played over the headset was too high a frequency. I fixed it by simply doubling up the code that creates the audio buffer so that it goes HIGH HIGH LOW LOW instead of HIGH LOW HIGH LOW.
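The fix works because repeating each sample halves the square wave's frequency: at an 8000 Hz sample rate, alternating every sample gives a 4000 Hz tone (right at the Nyquist limit, which the SCO link reproduces poorly), while two samples per half cycle gives 2000 Hz. A sketch of the generator as a plain Java helper (the class and parameter names are my own):

```java
// Generates a 16-bit PCM square wave. The tone frequency is
// sampleRate / (2 * samplesPerHalfCycle): with samplesPerHalfCycle == 1
// (the original code) that is 4000 Hz at 8000 Hz; with 2 it is 2000 Hz.
public class SquareWave {

    public static short[] generate(int totalSamples, int samplesPerHalfCycle) {
        short[] data = new short[totalSamples];
        for (int i = 0; i < totalSamples; i++) {
            boolean high = (i / samplesPerHalfCycle) % 2 == 0;
            data[i] = high ? (short) 32767 : (short) -32768;
        }
        return data;
    }

    public static void main(String[] args) {
        short[] tone = generate(8000 * 20, 2); // 20 s of a 2 kHz tone at 8 kHz
        System.out.println("samples: " + tone.length);
        // On Android this buffer would then go to an AudioTrack sized in bytes:
        // new AudioTrack(AudioManager.STREAM_VOICE_CALL, 8000,
        //         AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
        //         tone.length * 2, AudioTrack.MODE_STATIC);
    }
}
```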

Why doesn't Java play sounds over Bluetooth headset?

So on my machine Bluetooth works fine; I can stream audio to it and record from it, except when I run a Java program that has sound. The sound files work through the regular speakers, but they don't get forwarded to the headset. My current operating system is Lubuntu 10.04.
My code to play a sound is:
public static void playSound(File sound) {
    try {
        // File.toURL() is deprecated; go through toURI() instead.
        AudioClip cp = Applet.newAudioClip(sound.toURI().toURL());
        cp.play();
    } catch (MalformedURLException ex) {
        ex.printStackTrace();
    }
}
The Applet.newAudioClip() method is pretty darn old, like Java 1.0 old. Since then, Java has rewritten a lot of its sound APIs. I bet whatever code is playing that sound doesn't take into account the various audio settings of the OS. The javax.sound.sampled package has the new APIs, and while they are harder to learn, they give you much more control over how the sound is played and modified.
http://download.oracle.com/javase/tutorial/sound/sampled-overview.html
You could test whether Java can play audio over your Bluetooth headset at all by downloading
http://www.javazoom.net/index.shtml
and trying to play an MP3 to see if it goes over the headset.
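For reference, a rough sketch of the javax.sound.sampled route: synthesize PCM samples and push them through a SourceDataLine, which plays through whatever output the OS has currently selected (so a system-level Bluetooth sink is respected). The class and helper names are my own, and playback is guarded because a machine with no audio line will throw:

```java
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.SourceDataLine;

public class TonePlayer {

    // Pure helper: synthesize little-endian 16-bit mono PCM for a sine tone.
    public static byte[] sineTone(double freqHz, int sampleRate, int numSamples) {
        byte[] pcm = new byte[numSamples * 2];
        for (int i = 0; i < numSamples; i++) {
            short s = (short) (Math.sin(2 * Math.PI * freqHz * i / sampleRate) * 32767);
            pcm[2 * i] = (byte) (s & 0xFF);          // low byte first (little-endian)
            pcm[2 * i + 1] = (byte) ((s >> 8) & 0xFF);
        }
        return pcm;
    }

    public static void main(String[] args) {
        AudioFormat format = new AudioFormat(44100f, 16, 1, true, false);
        byte[] pcm = sineTone(440.0, 44100, 44100); // one second of A4

        // Guarded: headless machines may have no audio lines at all.
        try (SourceDataLine line = AudioSystem.getSourceDataLine(format)) {
            line.open(format);
            line.start();
            line.write(pcm, 0, pcm.length);
            line.drain();
        } catch (Exception e) {
            System.out.println("No audio line available: " + e.getMessage());
        }
    }
}
```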
