Write a program using a MIDI device in Java

I would like to create a Java program that does simple things to a MIDI device, in this case the Novation Launchpad MK2. It is a grid of 8x8 buttons that can light up. I don't have any experience with using MIDI in Java, and I don't know where to start.
The basic idea is that I want buttons to light up for example when I press them. This means I want to send note and velocity data to the device, but also that pressing a button on the device should send a command to my program. There is documentation on the device on the website of Novation: https://customer.novationmusic.com/support/product-downloads?product=Launchpad.
Is this idea possible, or would it be harder than I imagine? My experience with Java (or programming as a whole) is fairly limited, but I know a good portion of the basics. Does anyone have an idea of how to do this, how complicated it is, and how I should go about it?

Java has an API for your needs: the Java Sound API (javax.sound.midi). Just learn it.
https://docs.oracle.com/javase/tutorial/sound/accessing-MIDI.html
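For example, a minimal sketch of sending a note-on to the device could look like this (picking the device by name and the particular note/velocity values are assumptions on my part; the Launchpad's actual note numbers and colour velocities are listed in the programmer's reference you linked):

    import javax.sound.midi.*;

    public class LaunchpadDemo {
        public static void main(String[] args) throws Exception {
            // List every MIDI device the system knows about.
            MidiDevice.Info[] infos = MidiSystem.getMidiDeviceInfo();
            for (MidiDevice.Info info : infos) {
                System.out.println(info.getName() + " - " + info.getDescription());
            }

            // Assumption: pick the entry whose name contains "Launchpad" and
            // that accepts Receivers (i.e. the output we can send messages to).
            MidiDevice out = null;
            for (MidiDevice.Info info : infos) {
                MidiDevice dev = MidiSystem.getMidiDevice(info);
                if (info.getName().contains("Launchpad") && dev.getMaxReceivers() != 0) {
                    out = dev;
                    break;
                }
            }
            if (out == null) return;
            out.open();

            // Light a pad: a note-on where the note number selects the pad and
            // the velocity selects the colour (values 11 and 5 are placeholders).
            ShortMessage noteOn = new ShortMessage();
            noteOn.setMessage(ShortMessage.NOTE_ON, 0, 11, 5);
            out.getReceiver().send(noteOn, -1);

            // For incoming button presses you open the matching input device and
            // attach your own Receiver to its Transmitter:
            // inDevice.getTransmitter().setReceiver(myReceiver);

            out.close();
        }
    }

The tutorial above walks through the same steps (obtaining devices, Receivers and Transmitters) in more detail.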

Related

How do I run an apk on J2ME?

What's up! I just want to ask: how do I run an .apk file on J2ME? I know that this question HAS been asked, but I can't seem to find an answer. Also, I'm not asking to run J2ME apps on Android; there are already tons of emulators. I'm asking about running APKs on J2ME. I also know that they are developed in different VMs. So, is there any way to do that? Also, another side note, how do I resize the screen of an APK file? Thanks!
-Apersonwithalaptop22
(Edited to be easier to be understood)
You don't. A J2ME environment wouldn't have the Android framework. You'd need to write a complete Android framework in the J2ME language. It's not as simple as just converting Dalvik bytecode to JVM bytecode; it's actually providing every single Android class in the SDK. It could be done, but it would take one person a few years, and some things would never work quite right. Possibly you can find a project that's tried to start that effort, although I doubt it: Android moves too rapidly for them to keep up, and J2ME is basically dead.
As for resizing the screen of an APK file: you don't. APK files don't have a screen size. Instead the idea is to write a UI that scales.

Is there a way to create keyboard macros through an android app?

First of all, I am extremely new to coding and I just learned the basics of Java. I want to do an Android app as my first project. The idea of the app would be to have buttons on your phone which would work as keyboard macros. I was wondering if I also needed to create specific Windows drivers for it or if I could just make it run on the default Windows keyboard drivers? As I said, I am super new to all of this (about 10 hours of programming experience), so feel free to correct me and educate me as much as you want! I won't take it personally, I'm looking to learn :)
You would need to set up some sort of communication between the app and your PC. You wouldn't need to do anything with the keyboard drivers. Java has something called the "Robot class" which allows you to simulate a keypress.
Here is the documentation on the robot class: https://docs.oracle.com/javase/7/docs/api/java/awt/Robot.html
As for the communications, you will need to create a server/client connection. One of your devices will act as the client (probably your phone) and one will act as the server (probably your PC).
This is just a rough idea of how it would work but:
When you tap the button on your phone, it would send some specified data to the server running on your PC. To make it easier to implement, the data sent from the client to the server should be the keycode of the key(s) you want to simulate. When the server gets data from the client, it should pass that keycode to Robot.keyPress().
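A rough sketch of the server side (assuming a plain TCP socket and that the phone sends one keycode per line as text; the port number is arbitrary):

    import java.awt.Robot;
    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.ServerSocket;
    import java.net.Socket;

    public class MacroServer {
        public static void main(String[] args) throws Exception {
            Robot robot = new Robot();                           // simulates the key presses
            try (ServerSocket server = new ServerSocket(5000)) {
                Socket client = server.accept();                 // wait for the phone to connect
                BufferedReader in = new BufferedReader(
                        new InputStreamReader(client.getInputStream()));
                String line;
                while ((line = in.readLine()) != null) {
                    int keyCode = Integer.parseInt(line.trim()); // e.g. KeyEvent.VK_A is 65
                    robot.keyPress(keyCode);
                    robot.keyRelease(keyCode);                   // don't forget to release
                }
            }
        }
    }

The Android side would just open a Socket to the PC's IP address and write the keycode whenever a button is tapped.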
This honestly sounds like it will be a big project for your skill level but I wish you the best of luck on this. This will probably be a frustrating experience but don't let it get the better of you.

Use Java or other languages in a Flutter Application

Since I got no answer and not much feedback on this question: Android Flutter Analyze Audio Waveform
and found nothing online about what I'm looking for, I'll simply ask a broader question. A comment on that answer told me to use native code and a platform channel to connect it to Flutter, but when I asked for clarification I got nothing.
So my question is whether I can do operations in Java (which has been around much longer, and thus has far more documentation), and then use the outcome in Flutter.
More precisely, could I do these things in Java and Flutter:
1) Analyse the audio waveform, find peak points in specific frequency ranges, and use the timestamps to display them in Flutter;
Edit 1:
What are peak points?
This is the waveform of different frequency ranges (the orange one is bass, 80-255 Hz), and the points circled in black are peak points. I should analyze the audio spectrum of a song and find the peak points in certain frequency ranges. Then, when I find the peaks, I need to save the timestamps, for example 16 seconds in, and so on.
2) Edit 2:
I need to edit some photos into a video, like a video collage, in which each frame of a 30 or 60 fps video is an image.
3) Edit 3:
I need to add basic frame-specific effects to the video, for example a blur that changes from frame to frame, or a glare.
4) Adding music to that video and saving it to an mp4, avi, or any other format.
5) Edit 4: Most important thing: I don't want to do all of this in real time, but more like an After Effects-style render process, in which all the frames are rendered together. The only thing that would be nice is a sort of progress bar telling the user that the render is at, for example, frame 200 of 300; but I don't want to display any of the frames or the video, just render it in the background and then save it to an mp4 video that can be viewed afterwards.
As you can see, this is a difficult process to do in a language for which, given its early state, you can hardly find a tutorial even on how to play music. But UIs and some other things are way easier to do in Flutter, and Flutter is also multi-platform, so I'd prefer to stick with Flutter.
Edit 5:
I took a look at Qt and JUCE, and found that Qt seems a valid alternative, but from what I understood it is more of a "closed" system: for example, I looked at the multimedia library, and as far as I can tell you can do basic things like play a video, but not collage frames together and save the result. (I don't know if I explained myself well.) JUCE, on the other hand, looks better, but it seems aimed more at PC audio VSTs than at mobile applications involving video rendering. And another thing: these two are not free and open source like Flutter is.
Then there is Kivy, which may or may not be the best fit: it is a Python port for mobile devices, and I have a lot of experience with Python, which I think is one of the easier languages to learn, but on the other hand it hasn't got that much UI power, and as you mentioned there could be problems using libraries on Android.
You stated I could use C++ or Java with Flutter, but you said that with C++ it's a difficult process. So my question turned out to be: could I write the processing in Java as a normal Android application and then somehow use those functions in a Flutter app?
Edit 6:
I found a possible alternative:
Kha (http://kha.tech/). But again I found nothing on how to use it with Flutter. Could it be a good idea?
I'm asking more for a confirmation of whether I could use Java or any other language to do what I need in a Flutter application, and if yes, whether it's complicated or not (I'm sort of a beginner). Some tutorials or links to kickstart the code would be helpful as well!
Flutter at this time is great for building UIs but, as you've mentioned, it doesn't have a lot of power or compatibility with libraries yet. A good part of the reason for that is that it doesn't have easy integration with C++, but I won't get into that now.
What you're asking is most likely possible, but it's not going to be simple at all to do. First, it sounds like you want to pull particular frames from a video and display them; that's going to be an additional complication. And don't forget that on a mobile device you have somewhat limited processing power: things will have to be very asynchronous, which can actually cause problems for Flutter unless you're careful.
As to your points:
This is a very general ask. I'd advise looking up Android audio processing libraries. I'm almost sure it's possible, but SO questions are not meant for asking for advice on which framework to use. Try https://softwarerecs.stackexchange.com/.
Once again, fairly general and a bit unclear about what you're asking... Try softwarerecs. I assume you want to take several frames and make them into a video?
Some of those effects (i.e. zoom) you could definitely do with Flutter using a Transform. But that would just be while playing in Flutter, rather than adding to the video files themselves. To do that, you'll have to use the video library in Android/Java code.
Once again, the video library should do this.
This should also be part of the video library.
I do know of one audio/video library off the top of my head called Processing that may do what you need, but I'm not sure. It does have an Android SDK though. OpenCV would be another, but only for video/image processing, and I haven't used it directly from Java so I'm not sure how easy it is to use.
For how you'd actually go about implementing this along with flutter... you're going to need to use Platform Channels. I mentioned them in the comment to your other answer but figured you could look that up yourself. The documentation does do a much better job of explaining how that works and how to set it up than I can. But the TLDR is that essentially, what they allow you to do is to send serialized data from native code (java/kotlin/swift etc) to flutter code (dart) and vice-versa, which gets translated into similar data structures in the target language. You can set up various 'channels' upon which the data flows, and within those channels set up 'methods' which get called at either end, or simply send events back and forth.
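As a rough sketch of what the Java side of a platform channel can look like (the channel name, method name, and the findPeaks helper are all made up for illustration; the Dart side would create a MethodChannel with the same name and call invokeMethod on it):

    import androidx.annotation.NonNull;
    import io.flutter.embedding.android.FlutterActivity;
    import io.flutter.embedding.engine.FlutterEngine;
    import io.flutter.plugin.common.MethodChannel;
    import java.util.ArrayList;
    import java.util.List;

    public class MainActivity extends FlutterActivity {
        // Arbitrary channel name; it just has to match the Dart side.
        private static final String CHANNEL = "app/audio_analysis";

        @Override
        public void configureFlutterEngine(@NonNull FlutterEngine flutterEngine) {
            super.configureFlutterEngine(flutterEngine);
            new MethodChannel(flutterEngine.getDartExecutor().getBinaryMessenger(), CHANNEL)
                    .setMethodCallHandler((call, result) -> {
                        if ("findPeaks".equals(call.method)) {
                            String path = call.argument("path");       // file path sent from Dart
                            List<Double> timestamps = findPeaks(path); // your Java analysis
                            result.success(timestamps);                // serialized back to Dart
                        } else {
                            result.notImplemented();
                        }
                    });
        }

        // Placeholder for the actual analysis done with whatever Java library you pick.
        private List<Double> findPeaks(String path) {
            return new ArrayList<>();
        }
    }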
The complication I mentioned at the beginning is that sending images back and forth across the channels between native code and Dart isn't all that optimal. You most likely won't get a smooth 24/30/60 fps of images being sent from Java to Dart, and it might slow down the rest of the Flutter UI significantly. So what you'll want to use for the actual viewport is instead a Texture, which simply displays data from the Android side. You'll have to figure out how to write to a texture from Android yourself, but there's lots of information available for that. Controls, the visualization of the audio, etc. can be done directly in Flutter with data that is retrieved from native code.
What you'll have is essentially a remote control written in dart/flutter, which sends various commands to a audio/video processing library & wrapper code in Java.
If all that sounds complicated, that's because it is. And as much as flutter is very nice to build UIs in, I have doubts as to whether it's going to be worth the extra complications if you're only targeting android.
Not really related to the answer but rather some friendly advice:
There is one other thing I'll mention: I don't know your level of proficiency with programming and with different languages, but video/audio processing and such are generally not done in Java but rather in actual native code (i.e. C/C++). As such, there are actually two levels of abstraction you're going to be dealing with here (to some degree, as it will probably be abstracted somewhat or a lot depending on the library you're using): C/C++ to Java, and Java to Dart.
You may want to cut out the middlemen and work more directly with native code; in that case I'd recommend at least taking a look at Qt or JUCE, as they may be more suitable than Flutter for your particular use case. There's also Kivy (uses Python), which may work well as there's a ton of image/video/audio processing libraries for Python, although they may not all work on Android and still involve the C++-to-Python translation to some degree. You'll have to look into licensing etc. though: Qt has a broad enough open-source licence for most Android apps, but JUCE you'd have to pay for unless you're doing open source. I'd have to recommend Qt slightly more than the others as it actually has native decoding of video frames etc., although you'd probably want to incorporate OpenCV or something for the more complicated effects you are talking about. But it would probably be on the same level of complicated as simply writing the Java code, with a slightly different UI style and easier integration with C++ libraries.

Possibly non-intrusive method to change a complete Android application's code with a library

I'm having quite a tough problem while developing a testing framework for Android apps. The text got a bit long, so the actual question is in bold for those who don't want to read the context.
Basically, what I'd like to achieve right now is to trace user activity while they're using the application, as one of the features. There's my app, which manages context data all the time, and the developer's app, the one being tested. My idea was to get the coordinates where the user touched the screen and take a screenshot at the same time. Then I'd use the coordinates to mark the spot on the screenshot to get an idea of what the user was doing the whole time with the app, take hints on user experience, and trace crashes.
Non-system apps cannot take a screenshot for security reasons, but an application can take a screenshot of its own Activities without much trouble for non-rooted users, e.g. like here. My only hope is to hook into the developer's code to implement that functionality while my testing app is running. Each Activity would then have to extend my overridden Activity instead of the regular one, implement an interface, register a broadcast receiver, etc.
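For illustration, the per-Activity screenshot part I have in mind is roughly this (just a sketch; marking the touch coordinates and saving the bitmap are omitted):

    import android.app.Activity;
    import android.graphics.Bitmap;
    import android.graphics.Canvas;
    import android.view.View;

    public final class ScreenshotHelper {
        // Captures the current content of the given Activity into a Bitmap.
        public static Bitmap capture(Activity activity) {
            View root = activity.getWindow().getDecorView().getRootView();
            Bitmap bitmap = Bitmap.createBitmap(root.getWidth(), root.getHeight(),
                    Bitmap.Config.ARGB_8888);
            root.draw(new Canvas(bitmap));  // draw the view hierarchy into the bitmap
            return bitmap;                  // then mark the touch coordinates and save it
        }
    }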
I am going to write a library for developers who would like their app to be tested with my framework. I'd like it to do the job for me and be as non-intrusive and easy to use as possible. What is the best way to achieve that?
The ideal case would be linking the library to the project with maybe a small addition to the manifest that gets the job done, and afterwards just unlinking it and removing that bit of XML from the manifest for production.
That's an open question. I don't expect any bits of code, but some nifty Java trick, Android OS functionality, or even a completely different approach that would solve my problem.
I tried to be as clear as possible with the question, but this is quite a tough matter for me to describe, so it could have turned out the opposite. Don't hesitate to ask me for more details, to make me explain more clearly, or even to rewrite the question. Thank you all very much for your help!

Building a music player for both android and Desktop at the same time

I'm going to build a music player working on both Android and desktops. It won't be anything special; I'm doing it more to train myself and learn more or less what problems I might encounter if I want to do a real app/program one day. Therefore, since I'm already rather decent at web technologies, I'll try to use something else: Java.
My app/program will have to:
be able to read music files and play them (I'm planning on reading the files myself, meaning that I only need to be able to read "raw" sound, WAV or such)
be able to write to music files (to change tags)
be able to communicate with another instance of the program on another device that's on the same network (I want to be able to use my phone as a remote control and my pc as a remote control for my phone)
If possible, show some play/pause buttons on the screen even if it's locked (probably just on android)
And this is where I need your help: what would I do to write as little "device-specific" code as possible?
It's obvious I can reuse the classes used to encode/decode some music formats. Finding the files, reading them, writing them, playing raw sound and connecting to the network will be easy to abstract if needed.
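For example, the kind of abstraction I have in mind for playback looks roughly like this (just a sketch; the names are made up):

    // Shared, platform-independent interface for raw PCM output.
    public interface AudioOutput {
        void open(int sampleRate, int channels, int bitsPerSample);
        void write(byte[] pcm, int offset, int length); // decoded samples go here
        void close();
    }

    // On the desktop this would wrap javax.sound.sampled.SourceDataLine,
    // on Android it would wrap android.media.AudioTrack; the decoders and the
    // player logic only ever talk to AudioOutput, so they can be shared.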
But then there is the UI and it looks like if I don't plan carefully, I'll have to do it twice... I've seen libGDX but they kinda insist a lot on the fact it's for games...
All I need is some way to build a simple UI (a few buttons, the cover of the albums) that'd work for both the desktop and the phone.
Should I use libGDX, the "normal" libs (AWT/SWT, Swing, neither of which seems to be "compatible" with Android), or something else?
I'd also like to request as few permissions as possible. Meaning that I'd like to have a base music player that only request access to the sd card, and then features requiring additional permissions would be added as other apps/programs or addons.
From what I understood, the only way to achieve this is to create a second app and make the user install it. I think I'll manage to make the two apps communicate (with Intent?) but is it really the only solution?
Thank you in advance for your answers.
Maybe you could consider building the app with something such as PhoneGap: http://phonegap.com/ This would let you play to your web-technology strengths and write a very slim layer of device-specific code, if any at all!
As for getting a PhoneGap app to run on the desktop, you could use something like http://ripple.incubator.apache.org/. I know this is slightly different and you wanted to tackle writing something in Java; however, this is the way mobile development is moving, so you may want to get started like this!
