SoundTouch on Android? - java

Hey, I am trying to change the pitch of an audio file in an Android application.
I have found an open source library online, "SoundTouch" (http://www.surina.net/soundtouch); do you think I can use this library in an Android app?
I have been Googling "SoundTouch in Java" and have found this (http://www.aplu.ch/classdoc/jaw/ch/aplu/jaw/SoundTouch.html).
Is it possible to use that library, or does anyone have ideas about other libraries or approaches I could use to alter the pitch of an audio file on Android? I have also looked into the Java Sound API, but Android does not support it. :/
Thanks
Adam

Have you thought about just changing the sample rate? If you load the raw audio samples into memory and play them using Android's AudioTrack class, you can specify a wide range of possible sample rates and Android will resample for you. This will change both the pitch and tempo, like playing a record at the wrong speed. If you absolutely need to change the pitch without affecting the tempo, you'll need SoundTouch or something similar.
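If it helps, here is a minimal sketch of that sample-rate trick, assuming you have already decoded the file to raw 16-bit mono PCM in a byte array (this uses the older AudioTrack constructor; the class name and speed factor are just examples):

```java
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

public class PitchBySampleRate {
    // Plays raw 16-bit mono PCM at a different rate than it was recorded at.
    // speedFactor > 1.0 raises the pitch and speeds the audio up; < 1.0 does the opposite.
    public static void play(byte[] pcm, int originalSampleRate, float speedFactor) {
        int playbackRate = (int) (originalSampleRate * speedFactor);
        int minBuf = AudioTrack.getMinBufferSize(playbackRate,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);

        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, playbackRate,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                minBuf, AudioTrack.MODE_STREAMING);
        track.play();
        track.write(pcm, 0, pcm.length); // blocks while the data is being queued
        track.stop();
        track.release();
    }
}
```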
Anyway, this is definitely possible, but it will take a fair amount of work.
You'll need to use the Android NDK (native development kit) to compile SoundTouch. The Java wrapper you found might be helpful, but ultimately you'll need to get your hands dirty with the NDK.
You'll also need to write your own code to read the audio file from disk, and then buffer it through SoundTouch and out through the AudioTrack class. MediaPlayer won't help you here.
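Very roughly, that pipeline ends up looking like the sketch below. Note that the native methods and the PcmSource reader shown here are hypothetical placeholders: they stand in for whatever JNI interface you expose from your own NDK build of SoundTouch and for your own file decoding code.

```java
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

// Hypothetical pipeline: decoded PCM -> SoundTouch (via your own JNI wrapper) -> AudioTrack.
public class PitchShiftPlayer {
    static { System.loadLibrary("soundtouch_jni"); } // example name of your NDK-built library

    // These native methods are made up; you would implement them in C++ against SoundTouch.
    private native long stCreate(int sampleRate, int channels, float pitchSemitones);
    private native int stProcess(long handle, short[] in, int inLen, short[] out);
    private native void stDestroy(long handle);

    /** Reads 16-bit mono PCM chunks from 'decoder' (your own file reader), shifts the pitch,
     *  and streams the result out through AudioTrack. */
    public void play(PcmSource decoder, int sampleRate) {
        long st = stCreate(sampleRate, 1, 3.0f); // e.g. shift up three semitones
        int minBuf = AudioTrack.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                minBuf, AudioTrack.MODE_STREAMING);
        track.play();

        short[] in = new short[4096];
        short[] out = new short[8192];
        int read;
        while ((read = decoder.read(in)) > 0) {
            int produced = stProcess(st, in, read, out); // pitch-shifted samples back from native code
            track.write(out, 0, produced);
        }
        track.stop();
        track.release();
        stDestroy(st);
    }

    /** Placeholder for whatever you use to decode the file to PCM. */
    public interface PcmSource {
        int read(short[] buffer); // returns the number of samples read, or 0 at end of stream
    }
}
```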
Finally, note that you'll need to abide by the terms of the LGPL when releasing your app.

https://github.com/nonameentername/soundtouch-android
Please follow the above link and build it with the NDK.
The SoundTouch source bundled there is outdated, so please download the official source and replace it.
Build in a Linux environment.
For any queries, please get back to me.
If you want to change the pitch of a sound, you can use OpenAL:
http://pielot.org/2011/11/10/openal4android-2/

Related

Use Java or other languages in a Flutter Application

Since I got no answer and not much feedback on this question: Android Flutter Analyze Audio Waveform, and found nothing online about what I'm looking for, I'll simply ask a broader question. A comment on that answer told me to use native code and a platform channel to connect it to Flutter, but when I asked for clarification I got nothing.
So my question is whether I can do operations in Java (which has been around much longer and thus has far more documentation), and then use the outcome in Flutter.
More precisely, could I do these things in Java and Flutter:
1) Analyse the audio waveform and find peak points in specific frequencies, and use the timestamps to display them in Flutter;
Edit 1:
What are peak points?
This is the waveform of different frequency ranges (the orange one is bass, 80-255 Hz), and the points circled in black are peak points. I need to analyze the audio spectrum of a song and find the peak points in certain frequencies. When I find the peaks, I need to save the timestamps; for example, 16 seconds in, and so on.
2) Edit 2:
I need to edit some photos into a video, like a video collage, in which each frame of a 30 or 60 fps video is an image.
3) Edit 3:
I need to add basic frame-specific effects to the video, for example a blur that changes from frame to frame, or a glare.
4) Adding music to that video and saving it to an mp4, avi, or any other format.
5) Edit 4: Most important thing: I don't want to do all this in real time, but more like an After Effects-style render process, in which all the frames are rendered together. The only thing that would be nice is a sort of progress bar telling the user that the render is at frame, for example, 200 of 300, but I don't want to display any of the frames or the video, just render it in the background and then save it to an mp4 video that can be viewed afterwards.
As you can see, this is a difficult process to do in a language for which, due to its early state, you can hardly even find a tutorial on how to play music. But UIs and some other things are way easier to do in Flutter, and Flutter is also multi-platform. So I would prefer to stick with Flutter.
Edit 5:
I took a look at Qt and JUCE, and found that Qt seems a valid alternative, but from what I understood it is more of a "closed" system: for example, I looked at the multimedia library and, as far as I can tell, you can do basic stuff, such as play a video, but not collage frames together and save the result. (I don't know if I explained myself well.) JUCE, on the other side, looks better, but it seems aimed more at PC audio VSTs than at mobile applications including video rendering. And another thing is that these two are not free and open source like Flutter is.
Then there is Kivy, which could or could not be the best: it is a Python port for mobile devices, I have a lot of experience with Python, and I think it's one of the easier languages to learn; but on the other side, it hasn't got that much UI power, and as you mentioned, there could be problems using libraries on Android.
You stated I could use C++ or Java with Flutter, but you said that with C++ it's a difficult process. So my question turned out to be: could I write the processing in Java as a normal Android application, and then in some way use those functions from a Flutter app?
Edit 6:
I found a possible alternative:
Kha (http://kha.tech/). But again I found nothing on how to use it with Flutter. Could it be a good idea?
I'm asking more for confirmation on whether I could use Java or any other language to do what I need in a Flutter application, and if yes, how complicated it would be. (I'm a beginner, sort of.) Some tutorials or links to kickstart the code would be helpful as well!
Flutter at this time is great for building UIs but as you've mentioned, it doesn't have a lot of power or compatibility with libraries yet. A good part of the reason for that is that it doesn't have easy integration with c++, but I won't get into that now.
What you're asking is most likely possible but it's not going to be simple at all to do. First, it sounds like you're wanting to pull particular frames from a video, and display them - that's going to be an additional complication. And don't forget that on a mobile device you have somewhat limited processing power - things will have to be very asynchronous which can actually cause problems for flutter unless you're careful.
As to your points:
1. This is a very general ask. I'd advise looking up Android audio processing libraries. I'm almost sure it's possible, but SO questions are not meant for asking advice on which framework to use. Try https://softwarerecs.stackexchange.com/.
2. Once again, fairly general and a bit unclear about what you're asking... Try softwarerecs. I assume you're wanting to take several frames and make them into a video?
3. Some of those effects (i.e. zoom) you could definitely do with Flutter using a Transform. But that would just be while playing in Flutter rather than adding to the video files themselves. To do that, you'll have to use the video library in Android/Java code.
4. Once again, the video library should do this.
5. This should also be part of the video library.
I do know of one audio/video library off the top of my head called Processing that may do what you need, though I'm not sure. It does have an Android SDK, though. OpenCV would be another, but only for video/image processing, and I haven't used it directly with Java, so I'm not sure how easy it is to use.
For how you'd actually go about implementing this along with flutter... you're going to need to use Platform Channels. I mentioned them in the comment to your other answer but figured you could look that up yourself. The documentation does do a much better job of explaining how that works and how to set it up than I can. But the TLDR is that essentially, what they allow you to do is to send serialized data from native code (java/kotlin/swift etc) to flutter code (dart) and vice-versa, which gets translated into similar data structures in the target language. You can set up various 'channels' upon which the data flows, and within those channels set up 'methods' which get called at either end, or simply send events back and forth.
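For example, here is a rough sketch of the Android side of a method channel, written against the Java embedding Flutter used at the time of this question (the channel name, method name, and return value are all made up for illustration):

```java
import android.os.Bundle;
import io.flutter.app.FlutterActivity;
import io.flutter.plugin.common.MethodChannel;
import io.flutter.plugins.GeneratedPluginRegistrant;

public class MainActivity extends FlutterActivity {
    private static final String CHANNEL = "com.example/audio_analysis"; // example channel name

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        GeneratedPluginRegistrant.registerWith(this);

        new MethodChannel(getFlutterView(), CHANNEL).setMethodCallHandler((call, result) -> {
            if (call.method.equals("findPeaks")) {
                String path = call.argument("path");
                // ... run your Java/native analysis on 'path' here ...
                result.success(java.util.Arrays.asList(16.0, 42.5)); // e.g. peak timestamps in seconds
            } else {
                result.notImplemented();
            }
        });
    }
}
```

On the Dart side you then invoke the same channel and method name with invokeMethod and receive the list back asynchronously.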
The complication I mentioned at the beginning is that sending images back and forth across the channels between flutter and dart isn't all that optimal. You most likely won't get a smooth 24/30/60fps of images being sent from java to dart, and it might slow down the rest of the flutter ui significantly. So what you'll want to use for the actual viewport is instead a Texture, which simply displays data from the android side. You'll have to figure out how to write to a texture from android yourself, but there's lots of information available for that. Controls, the visualization of the audio, etc can be done directly in flutter with data that is retrieved from native.
What you'll have is essentially a remote control written in dart/flutter, which sends various commands to a audio/video processing library & wrapper code in Java.
If all that sounds complicated, that's because it is. And as much as flutter is very nice to build UIs in, I have doubts as to whether it's going to be worth the extra complications if you're only targeting android.
Not really related to the answer but rather some friendly advice:
There is one other thing I'll mention - I don't know your level of proficiency with programming and with different languages, but video/audio processing and such are generally not done in java but rather in actual native code (i.e. c/c++). As such, there are actually two levels of abstraction you're going to have to be dealing with here (to some degree as it will probably be abstracted somewhat or a lot depending on the library you're using) - c/c++ to java and java to dart.
You may want to cut out the middlemen and work more directly with native - in that case I'd recommend at least taking a look at Qt or JUCE as they may be more suitable than flutter for your particular use case. There's also Kivy (uses python) which may work well as there's a ton of image/video/audio processing libraries for Python somehow... although they may not all work on android and still have the c++ => python translation to some degree. You'll have to look into licensing etc though - Qt has a broad enough OS licence for most android apps, but JUCE you'd have to pay for unless you're doing open source. I'd have to recommend Qt slightly more than the others as it actually has native decoding of video frames etc, although you'd probably want to incorporate OpenCV or something for the more complicated effects you are talking about. But it would probably be on the same level of complicated as simply writing in java code, but with a slightly different UI style & easier integration with c++ libraries.

Streaming audio inside a call sent from code in Android

Well, I want to place a call from my code and, when the other person answers the call, send an audio clip into the audio stream. I have read that this is possible, but also that it isn't. Help please.
So, multiple questions in one. I'll answer all of them, mostly so you can get the keywords you can search for.
Setting up a call programmatically: perfectly possible and quite easy (a minimal sketch is at the end of this answer).
Playing an audio file into the call: not so trivial. Using the Java API for Android, you just can't do it. However, you can create a C application which can play an audio file, use it in your Android app (NDK and JNI required here), and give it control of the microphone.
If you are comfortable using C and think you can learn how to use the NDK and JNI, you can do it. If you need it to be done using Java... right now you will just not be able to do it.
PS: if you are thinking about playing a file through the speaker and hoping it gets picked up by the microphone, that won't work on most of the devices out there. There are quite good echo cancellation chips out there.
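For the first point, a minimal sketch of placing a call from code looks like this (it needs the CALL_PHONE permission in the manifest, plus a runtime permission request on Android 6.0+; the helper class and number are just examples):

```java
import android.app.Activity;
import android.content.Intent;
import android.net.Uri;

public final class CallHelper {
    // Requires <uses-permission android:name="android.permission.CALL_PHONE"/> in AndroidManifest.xml.
    public static void placeCall(Activity activity, String number) {
        Intent call = new Intent(Intent.ACTION_CALL);
        call.setData(Uri.parse("tel:" + number));
        activity.startActivity(call);
    }
}
```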

How to access the ADC and DAC directly on Android phones?

I am at the beginning stages of a project in which I will be trying to make a hearing aid application for Android. I have written a few patches in Pure Data, Csound, and the basic Android sound library which basically take the input from the microphone and play it through the headphones. No filtering or amplification.
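(For reference, the plain-Java version of such a pass-through is essentially an AudioRecord feeding an AudioTrack, roughly like the sketch below; it needs the RECORD_AUDIO permission, and this buffered path is exactly where the latency problem lives.)

```java
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioRecord;
import android.media.AudioTrack;
import android.media.MediaRecorder;

public class PassThrough implements Runnable {
    private volatile boolean running = true;

    @Override
    public void run() {
        int rate = 44100;
        int inBuf = AudioRecord.getMinBufferSize(rate,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        int outBuf = AudioTrack.getMinBufferSize(rate,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);

        AudioRecord rec = new AudioRecord(MediaRecorder.AudioSource.MIC, rate,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, inBuf);
        AudioTrack out = new AudioTrack(AudioManager.STREAM_MUSIC, rate,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                outBuf, AudioTrack.MODE_STREAMING);

        short[] buffer = new short[inBuf / 2];
        rec.startRecording();
        out.play();
        while (running) {
            int n = rec.read(buffer, 0, buffer.length); // microphone in
            if (n > 0) out.write(buffer, 0, n);          // headphones out, no processing
        }
        rec.stop(); rec.release();
        out.stop(); out.release();
    }

    public void stopLoop() { running = false; }
}
```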
While Csound gave the best performance, the latency made the tools unusable. I know Android L is supposed to help, but my goal is to create a low-cost hearing-assist device, so older phones probably won't get it.
The next idea is to see if I can access the ADC and DAC values directly, then use C to make my own versions of AudioTrack and AudioRecord with the NDK, basically pointing to the places in memory where these values are coming in.
Is this possible? Also, what should I be researching? I can't find anything online about accessing the DAC and ADC directly.
Thank you for your time.
No. "Android" does not provide for direct access by apps to hardware at all.
The NDK does not change that, as you still lack permission to the audio hardware device nodes.
If you have a particular device on which you can install a customized build of Android, then you might be able to do something by adding new APIs or somehow giving your app or a special unix group access to the hardware nodes. But the details of how you might utilize that access would depend on the device chosen.

VLC (J) audio pitch control

I'm currently developing a simple video player with VLCJ.
Can anyone please point me to some clue about changing the audio pitch with it?
Is it possible?
I've been searching around but cannot find the right keyword. What I need is
some control (method/function) in vlcj (or anything else) to raise the sound so it sounds like
a kid's voice, or lower it so it sounds like a very old man's voice.
Thanks in advance.
NOTE:
Still looking on Google but found nothing about VLC. What I want is something related to the "timbre", as explained at http://www.screamingbee.com/support/morphdoc/MorphDocPitchTimbre.aspx
If you are only interested in playing audio (you don't care about displaying any video at the same time) then vlcj 2.4.0 and later provide a so-called "direct" audio player component.
With this component, your Java application can get direct access to the native audio sample buffer. You can run whatever algorithm you want on those samples, then play out your modified samples via JavaSound or some other API.
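For the "play the modified samples yourself" part, a minimal JavaSound sink could look like the sketch below (it assumes signed 16-bit little-endian PCM; wire it up to whatever sample rate and channel count the direct audio player gives you, and feed it the buffers after your pitch-processing step):

```java
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.SourceDataLine;

public class JavaSoundSink {
    private final SourceDataLine line;

    public JavaSoundSink(int sampleRate, int channels) throws LineUnavailableException {
        AudioFormat format = new AudioFormat(sampleRate, 16, channels, true, false); // signed, little-endian
        line = AudioSystem.getSourceDataLine(format);
        line.open(format);
        line.start();
    }

    // Call this with each (already processed) buffer of samples.
    public void write(byte[] samples, int length) {
        line.write(samples, 0, length);
    }

    public void close() {
        line.drain();
        line.stop();
        line.close();
    }
}
```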
There is a sample included in the vlcj distribution that shows how to use this component to play via JavaSound:
https://github.com/caprica/vlcj/tree/vlcj-2.4.1/src/test/java/uk/co/caprica/vlcj/test/directaudio
The example does not show how to change the pitch of the audio, but it does show how to use the direct audio player.

Which Java library to use to record a video from a connected camera?

You would have thought that there is a simple solution to this, but there isn't :(
My application needs to capture a stream from a camera connected via USB/FireWire (or whatever the connection is); the result would be a file like output.flv. I would prefer to be able to detect all connected cameras and choose which one(s) to use (one or more at the same time, giving one or more output files). The application has to be cross-platform.
Found libraries:
Xuggle - not very good camera support. Good for manipulating images and video.
JMF - an old API, but if I can use it, I will. I don't see a Mac OS X link on the downloads page.
FMJ - looks like a better version of JMF, but I can't find a way of installing it.
LTI-CIVIL - FMJ uses it. It looks like it only captures images from a camera (not video). I could use Xuggle to create a video from images taken with LTI-CIVIL. And like FMJ, it is difficult to install.
What are your suggestions on this one?
I'd recommend VLCj for this - it should be able to stream from webcams onto a Java canvas without any difficulties. It uses native code so you need to provide libvlc.so / dll but from there on it should work on all the major platforms (Windows, Mac, Linux).
You may need to look at out of process players for complete reliability which is a bit more complex (see here for my efforts so far) but once you've got that in place it should work fine.
There really is no good camera support for Java. You will have to use native code, tailored for each platform, through JNI to get video capture for your project.
There's a related question here. Basically they're suggesting OpenCV wrapped with JNI.
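If you go the OpenCV route, a minimal capture-and-record sketch with the Java bindings looks roughly like this (class names are from the OpenCV 3.x packages; 2.4 keeps them under highgui instead, and the native OpenCV library must be on the library path):

```java
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.Size;
import org.opencv.videoio.VideoCapture;
import org.opencv.videoio.VideoWriter;

public class WebcamRecorder {
    static { System.loadLibrary(Core.NATIVE_LIBRARY_NAME); } // loads the native OpenCV library

    public static void main(String[] args) {
        VideoCapture camera = new VideoCapture(0);  // device index 0 = first connected camera
        Mat frame = new Mat();
        camera.read(frame);                         // grab one frame to learn the resolution

        VideoWriter writer = new VideoWriter("output.avi",
                VideoWriter.fourcc('M', 'J', 'P', 'G'), 30,
                new Size(frame.width(), frame.height()));

        for (int i = 0; i < 300 && camera.read(frame); i++) { // roughly 10 seconds at 30 fps
            writer.write(frame);
        }
        writer.release();
        camera.release();
    }
}
```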
