Android game special requirements - java

I'm making a game for Android (inspired by BrickBreaker), and am just finishing it up.
I'm thinking about Android-specific requirements for the game, such as how to handle the various screen sizes, or not needing a back button.
What other things should I be thinking about?

Have you also considered:
Screen rotation
When the process goes into the background (see the sketch after this list)
Checking for memory leaks
Distribution (perhaps in a later stage of development)
Operating system specifics
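For the rotation and background points, a minimal sketch of saving and restoring progress around those lifecycle events (this assumes the level number is the only state; a brick-breaker would also persist ball, paddle and brick state):
```java
import android.app.Activity;
import android.content.SharedPreferences;
import android.os.Bundle;

public class GameActivity extends Activity {

    private static final String PREFS = "game_state";
    private int currentLevel = 1; // hypothetical game state

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Restore the level after a rotation, or after the process was killed in the background.
        if (savedInstanceState != null) {
            currentLevel = savedInstanceState.getInt("level", 1);
        } else {
            currentLevel = getSharedPreferences(PREFS, MODE_PRIVATE).getInt("level", 1);
        }
    }

    @Override
    protected void onPause() {
        super.onPause();
        // Persist progress here; the process may be killed at any time while backgrounded.
        getSharedPreferences(PREFS, MODE_PRIVATE)
                .edit()
                .putInt("level", currentLevel)
                .apply();
    }

    @Override
    protected void onSaveInstanceState(Bundle outState) {
        super.onSaveInstanceState(outState);
        // Survives configuration changes such as screen rotation.
        outState.putInt("level", currentLevel);
    }
}
```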
Good luck with your project!

Use Java or other languages in a Flutter Application

Since I got no answer and not much feedback on this question: Android Flutter Analyze Audio Waveform, and found nothing online about what I'm looking for, I'll simply ask a broader question. A comment on that question told me to use native code and a platform channel to connect it to Flutter, but when I asked for clarification I got nothing.
So my question is whether I can do operations in Java (which has been around for much longer and thus has far more documentation), and then use the outcome in Flutter.
More precisely, could I do these things in Java and Flutter:
1) Analyse the audio waveform and find peak points in specific frequency ranges, and use the timestamps to display them in Flutter;
Edit 1:
What are peak points?
This is the waveform of different frequency ranges (the orange one is bass, 80-255 Hz), and the points circled in black are peak points. I need to analyze the audio spectrum of a song and find the peak points in certain frequency ranges. Then, when I find the peaks, I need to save their timestamps, for example 16 seconds in, and so on.
2) Edit 2:
I need to edit some photos into a video, like a video collage, in which each frame of a 30 or 60 fps video is an image.
3) Edit 3:
I need to add basic frame-specific effects to the video, for example a blur that changes from frame to frame, or a glare.
4) Add music to that video and save it to mp4, avi or any other format.
5) Edit 4: Most importantly, I don't want to do all of this in real time, but rather in an After Effects-like render process, in which all the frames are rendered together. The only thing that would be nice is a progress bar telling the user that the render is at, for example, frame 200 of 300, but I don't want to display any of the frames or the video; I just want to render it in the background and then save it to an mp4 video that can be viewed afterwards.
As you can see, it's a difficult process to do in a language for which you can hardly find a tutorial on how to play music, due to its early state. But UIs and some other things are way easier to do in Flutter, and Flutter is also multi-platform. So I'd prefer to stick with Flutter.
Edit 5:
I took a look at Qt and JUCE, and found that Qt seems a valid alternative, but from what I understood it is more of a "closed" system: for example, I looked at its multimedia library, and as far as I can tell you can do basic things like play a video, but not collage frames together and save the result (I don't know if I explained myself well). JUCE, on the other hand, looks better, but it seems aimed more at desktop audio VSTs than at mobile applications that include video rendering. And another thing is that these two are not free and open source like Flutter is.
Then there is Kivy, which may or may not be the best option: it is a Python framework for mobile devices, and I have a lot of experience with Python and think it's one of the easier languages to learn, but on the other hand it doesn't have that much UI power, and as you mentioned there could be problems using libraries on Android.
You said I could use C++ or Java with Flutter, but that with C++ it's a difficult process. So my question has turned into: could I write the processing in Java, as in a normal Android application, and then somehow use those functions in a Flutter app?
Edit 6:
I found a possible alternative:
Kha (http://kha.tech/). But again I found nothing on how to use it with Flutter. Could it be a good idea?
I'm mostly asking for confirmation on whether I could use Java or any other language to do what I need in a Flutter application, and if so, how complicated it would be (I'm sort of a beginner). Some tutorials or links to kickstart the code would be helpful as well!
Flutter at this time is great for building UIs, but as you've mentioned, it doesn't have a lot of power or compatibility with libraries yet. A good part of the reason for that is that it doesn't have easy integration with C++, but I won't get into that now.
What you're asking is most likely possible, but it's not going to be simple at all to do. First, it sounds like you're wanting to pull particular frames from a video and display them - that's going to be an additional complication. And don't forget that on a mobile device you have somewhat limited processing power - things will have to be very asynchronous, which can actually cause problems for Flutter unless you're careful.
As to your points:
1) This is a very general ask. I'd advise looking up Android audio processing libraries. I'm almost sure it's possible, but SO questions are not meant for asking advice on which framework to use. Try https://softwarerecs.stackexchange.com/. (A bare-bones peak-detection sketch follows this list.)
2) Once again, fairly general and a bit unclear about what you're asking... Try softwarerecs. I assume you're wanting to take several frames and make them into a video?
3) Some of those effects (i.e. zoom) you could definitely do in Flutter using a Transform, but that would only apply while playing in Flutter rather than being added to the video files themselves. To do that, you'll have to use a video library in Android/Java code.
4) Once again, the video library should do this.
5) This should also be part of the video library.
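To make point 1 a bit more concrete, here is a bare-bones peak-detection sketch in plain Java (no particular audio framework; the band-filtered amplitude array and window length are assumed to come from whatever analysis library you end up choosing):
```java
import java.util.ArrayList;
import java.util.List;

public class PeakFinder {

    /**
     * Finds indices where the band-filtered amplitude is a local maximum above a
     * threshold, and converts them to timestamps in seconds.
     *
     * @param bandAmplitude amplitude envelope of one frequency band (e.g. 80-255 Hz),
     *                      one value per analysis window -- assumed to be produced elsewhere
     * @param windowSeconds duration of one analysis window, in seconds
     */
    public static List<Double> findPeakTimestamps(double[] bandAmplitude,
                                                  double windowSeconds,
                                                  double threshold) {
        List<Double> timestamps = new ArrayList<>();
        for (int i = 1; i < bandAmplitude.length - 1; i++) {
            boolean isLocalMax = bandAmplitude[i] > bandAmplitude[i - 1]
                              && bandAmplitude[i] > bandAmplitude[i + 1];
            if (isLocalMax && bandAmplitude[i] >= threshold) {
                timestamps.add(i * windowSeconds); // e.g. "16.0 seconds in"
            }
        }
        return timestamps;
    }
}
```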
I do know of one audio/video library off the top of my head called Processing that may do what you need, but I'm not sure. It does have an Android SDK though. OpenCV would be another, but only for video/image processing, and I haven't used it directly from Java so I'm not sure how easy it is to use.
For how you'd actually go about implementing this along with Flutter... you're going to need to use Platform Channels. I mentioned them in the comment to your other answer but figured you could look that up yourself. The documentation does a much better job of explaining how that works and how to set it up than I can. But the TLDR is that what they allow you to do is send serialized data from native code (Java/Kotlin/Swift etc.) to Flutter code (Dart) and vice versa, which gets translated into similar data structures in the target language. You can set up various 'channels' upon which the data flows, and within those channels set up 'methods' which get called at either end, or simply send events back and forth.
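As a minimal sketch of the Java side of such a channel (the channel name, method name and payload here are made up for illustration; the real analysis code would replace the placeholder):
```java
import androidx.annotation.NonNull;

import io.flutter.embedding.android.FlutterActivity;
import io.flutter.embedding.engine.FlutterEngine;
import io.flutter.plugin.common.MethodChannel;

public class MainActivity extends FlutterActivity {

    // Hypothetical channel name; the Dart side must use the same string.
    private static final String CHANNEL = "com.example.audio_analysis";

    @Override
    public void configureFlutterEngine(@NonNull FlutterEngine flutterEngine) {
        super.configureFlutterEngine(flutterEngine);

        new MethodChannel(flutterEngine.getDartExecutor().getBinaryMessenger(), CHANNEL)
                .setMethodCallHandler((call, result) -> {
                    if (call.method.equals("findPeaks")) {
                        // Audio file path sent from Dart; run the Java/native analysis on it.
                        String path = call.argument("path");
                        // Only serialisable types (lists, maps, numbers, strings) cross the channel.
                        result.success(java.util.Arrays.asList(16.0, 32.5, 48.25));
                    } else {
                        result.notImplemented();
                    }
                });
    }
}
```
On the Dart side, a MethodChannel created with the same name would call invokeMethod('findPeaks', ...) and receive that list of timestamps.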
The complication I mentioned at the beginning is that sending images back and forth across the channels between Java and Dart isn't all that optimal. You most likely won't get a smooth 24/30/60 fps of images being sent from Java to Dart, and it might slow down the rest of the Flutter UI significantly. So what you'll want to use for the actual viewport is a Texture, which simply displays data from the Android side. You'll have to figure out how to write to a texture from Android yourself, but there's lots of information available for that. Controls, the visualization of the audio, etc. can be done directly in Flutter with data that is retrieved from native code.
What you'll have is essentially a remote control written in Dart/Flutter, which sends various commands to an audio/video processing library and wrapper code in Java.
If all that sounds complicated, that's because it is. And as much as Flutter is very nice to build UIs in, I have doubts as to whether it's going to be worth the extra complications if you're only targeting Android.
Not really related to the answer but rather some friendly advice:
There is one other thing I'll mention - I don't know your level of proficiency with programming and with different languages, but video/audio processing and such are generally not done in Java but rather in actual native code (i.e. C/C++). As such, there are actually two levels of abstraction you're going to have to deal with here (to some degree, as it will probably be abstracted somewhat or a lot depending on the library you're using) - C/C++ to Java, and Java to Dart.
You may want to cut out the middlemen and work more directly with native code - in that case I'd recommend at least taking a look at Qt or JUCE, as they may be more suitable than Flutter for your particular use case. There's also Kivy (which uses Python), which may work well as there's a ton of image/video/audio processing libraries for Python... although they may not all work on Android, and they still have the C++ => Python translation to some degree. You'll have to look into licensing etc. though - Qt has a broad enough open-source licence for most Android apps, but JUCE you'd have to pay for unless you're doing open source. I'd have to recommend Qt slightly more than the others, as it actually has native decoding of video frames etc., although you'd probably want to incorporate OpenCV or something for the more complicated effects you are talking about. But it would probably be on the same level of complexity as simply writing it in Java, just with a slightly different UI style and easier integration with C++ libraries.

What's the right approach for creating an Android app?

I have a great idea for an Android app, but as I'm only familiar with php/js, I'm uncertain of which approach I should choose for creating it. The app will be based on a google map with a lot of position markers. There won't be any fancy animations or other heavy resource-demanding activities.
As I see it there are three different options:
Read up on Java and program the whole thing in Java
Create the map activity in Java as a mapview and then use webviews for the other activities (which can easily be scripted as html5 webpages.)
Script everything as a webapp (not really an option, as this is not a real mobile app imho).
I'm most keen on using no. 2 as I'm quite familiar with html/php/js/mysql. Have to read up on the html5 specifics, though. Questions:
I need access to GPS and camera hardware. Is that achievable in webviews?
How complicated is it to pass variables between js in webview activities and java in other activities?
How big a difference in performance can I expect if I use option 1 vs option 2?
Other thoughts?
Kind regards,
Anders
You can choose number 2, but as we are talking about an Android phone, you might want really accurate coordinates for your map, and you can only achieve this by accessing the phone's GPS; through webviews the best you can get is the location from the device's internet IP address, which doesn't give a very accurate geo position.
The best choice is a 100% Java application, in my opinion.
1) Yes it's possible, but as commented it will be less accurate and probably slow.
2) Not complicated. Painful if you need lots of interaction between a webview and the native app. Use a JavaScript interface that can be set up from the native app; you can basically inject JavaScript into a webview's HTML (see the sketch after this list).
3) Heterogeneity of performance depending on the device. Because your implementation will be based on the device's browser, you can expect really sluggish behaviour on older devices. Anything to do with HTML events (dragging, tabbing...) will take a noticeable hit on most devices, in my experience.
4) As #vodich comments, there are other third-party frameworks. My benchmarking of PhoneGap and other JS-based options is that they're a waste of time if you are looking at developing a professional app. I haven't developed on Adobe AIR, but I find it a pain to need to install plugins to get native functionality (access to sensors, camera, etc). Mobile is all about fast, responsive behaviour: the interface is your finger, the user is fast, so the app needs to be fast.
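As a minimal sketch of the JavaScript interface mentioned in point 2 (the class, method and page-function names are made up for illustration):
```java
import android.webkit.JavascriptInterface;
import android.webkit.WebView;
import android.widget.Toast;

public class WebAppBridge {

    private final WebView webView;

    public WebAppBridge(WebView webView) {
        this.webView = webView;
        webView.getSettings().setJavaScriptEnabled(true);
        // Exposed to the page's JS as window.Android (API 17+ requires @JavascriptInterface).
        webView.addJavascriptInterface(this, "Android");
    }

    // JS -> Java: callable from the page as Android.showToast("hi").
    // Interface methods run on a background thread, so UI work is posted to the main thread.
    @JavascriptInterface
    public void showToast(final String message) {
        webView.post(() ->
                Toast.makeText(webView.getContext(), message, Toast.LENGTH_SHORT).show());
    }

    // Java -> JS: inject a call into the page (updateMarker is a hypothetical page function).
    public void sendPosition(double lat, double lng) {
        webView.post(() ->
                webView.evaluateJavascript("updateMarker(" + lat + "," + lng + ");", null));
    }
}
```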
EDIT: So hell yeah! Java FTW!
Albert.
4) Other thoughts?
Yes, if you really want to make a great Android app, you should use only Android and Android-specific UI components, and give it a native look and feel. And regarding 1 and 2: yes, it is possible, and I would say not so complicated to just integrate them, but I think you'll eventually run into big problems.
Learn Java and write your application natively.
Webviews might allow you to use your PHP skills to present something to the user, but it's entirely one-way - you'll not be able to interact with what's inside.
The Android developer site offers fantastic documentation, and jumping from PHP to Java isn't greatly difficult, though you'll need to get used to strict typing and "real" OOP.
Other thoughts? Don't go down the PhoneGap/cross-platform toolkit road - it might allow you to write applications for multiple platforms using your current skills, but in the end you get a subpar app that doesn't feel right on either platform and doesn't fare well as future versions of iOS and Android are released.

Is it possible for an Android app to modify the screen at all times?

To give a ridiculous example, could you have a water app/service that stayed on the display AT ALL TIMES (or perhaps only when not actively using another application) and caused ripples to occur when you touched the screen? I would not want an app like this, but it's just an example.
I know there is a live wallpaper for this, but it is in the background. What if you wanted the effect on top of your icons and widgets and UI as well?
Is this possible?
Is this possible?
If you write your own firmware, yes. Android SDK applications cannot do this.

How to synthesize piano sounds in android/java

I have made a few simple apps on Android, and thought it was time for something a bit more complex. So, I thought I'd try something that's already out there, but build it from scratch.
The idea is to create an app that allows the user to play piano by pressing virtual keys on the display. But I'm not sure how to go about synthesizing the sound of each note: is it best to have copies of each note stored in files, or is there a more dynamic way of synthesizing notes and chords on the fly?
I have worked with C++ so NDK stuff is also okay.
Thanks for any help.
Sound playback (handing off buffers) pretty much has to be done through the Android Java APIs.
Synthesis could be done in native code or Java, whichever is preferred.
Short (uncompressed) samples could be played back repeatedly, but you probably also want an attack transient. Perhaps you could have an attack, a sustain, and a release, repeating the sustain as long as the key is down. Ideally each sample should be an integral number of periods of its fundamental component long, so that you don't get a transient when you change from attack to sustain or from sustain to release.
I'm sure you can find code somewhere for an FM or other synthesizer... this you might well want to implement in a native library that hands off buffers to Java code to pass to the audio APIs.
What is too bad is that Android already has an internal MIDI synthesizer, but apparently lacks a dynamic interface to it, so it can only play MIDI files.
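As a minimal illustration of handing buffers to the Android audio APIs from Java, here is a sketch that synthesizes and plays a plain sine tone with AudioTrack (a stand-in for a real synthesis engine; the frequency and duration values are arbitrary):
```java
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

public class TonePlayer {

    // Synthesizes and plays a sine tone; a real instrument would hand off buffers
    // from a proper synthesis engine (FM, sample-based, etc.) instead.
    public static void playTone(double frequencyHz, double durationSeconds) {
        int sampleRate = 44100;
        int numSamples = (int) (durationSeconds * sampleRate);
        short[] buffer = new short[numSamples];

        for (int i = 0; i < numSamples; i++) {
            double t = i / (double) sampleRate;
            buffer[i] = (short) (Math.sin(2 * Math.PI * frequencyHz * t) * Short.MAX_VALUE * 0.8);
        }

        int minBuf = AudioTrack.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT);

        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
                Math.max(minBuf, numSamples * 2), AudioTrack.MODE_STREAMING);

        track.play();
        track.write(buffer, 0, buffer.length); // blocks until the buffer is queued
        track.stop(); // in streaming mode, playback finishes the queued data first
        // A real app would call track.release() once playback has actually finished.
    }
}
```
Hypothetical usage: TonePlayer.playTone(261.63, 1.0) plays roughly a middle C for one second; in a real app this would run off the UI thread.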
By far the easiest solution would be to record the sound of each note on the piano and play it back when the key is pressed. Many professional virtual piano instruments work this way, recording every note on the piano being played at multiple velocities. Obviously this can take many gigabytes of disk space, but for a mobile phone app, you might get away with a single MP3 recording of each note in an octave.
Algorithmically synthesizing the sound of a piano is actually very difficult to do, and until fairly recently very few had done it convincingly (Pianoteq is one of the best current implementations).
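For the sample-playback approach, a minimal sketch using SoundPool (the raw resource name is a placeholder for a bundled recording; the playback rate can pitch a single sample up or down to cover nearby notes):
```java
import android.content.Context;
import android.media.AudioAttributes;
import android.media.SoundPool;

public class PianoSampler {

    private final SoundPool soundPool;
    private final int noteC4; // sound ID returned by load()

    public PianoSampler(Context context) {
        soundPool = new SoundPool.Builder()
                .setMaxStreams(10) // allow chords: several keys sounding at once
                .setAudioAttributes(new AudioAttributes.Builder()
                        .setUsage(AudioAttributes.USAGE_GAME)
                        .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
                        .build())
                .build();
        // R.raw.note_c4 is a placeholder for a bundled recording of middle C.
        noteC4 = soundPool.load(context, R.raw.note_c4, 1);
    }

    // rate 1.0f plays the sample as recorded; 0.5f..2.0f shifts it down/up an octave,
    // which lets one recording cover nearby notes at reduced quality.
    public void playNote(float rate) {
        soundPool.play(noteC4, 1f, 1f, 1, 0, rate);
    }

    public void release() {
        soundPool.release();
    }
}
```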

Warning Android user that app update could lead to losing data from old app version?

I'm writing a game for Android. When the user completes a level, they can restart from the next level if they lose (i.e. I need to store an integer to remember which level they got to). If the app is interrupted during play, I save the world state to disk (this is complex state storing a map and game entities).
I'd like to keep my options open in the future for changing my game code and the way the world state is saved/stored. However, I must consider the scenario where a user has an old version of the world state on their phone because they were in the middle of a game; they upgrade the app, and now the app cannot load the world state.
Having to write code to migrate the old version of the data to the new version would be a pain, and I'd like to avoid it if there's some way to do so. It would be nice if I could somehow ask the user to finish their current game in progress before updating. Can this be done? Are there any other options?
I don't intend to do this often. I'd like to iteratively develop my game while getting some early feedback, but this is difficult if I must fix how the world state is saved and restored now.
I hope this doesn't seem a silly question, but on a PC or a console it's perfectly OK to have games that you cannot save during a game or you can only save between levels. I'm just finding Android a bit of a pain here as you must have a save game strategy for all games.
You cannot prevent a user from upgrading an app, and you cannot execute any code until your app is installed (or upgraded).
Quite frankly, losing data due to an upgrade is unacceptable. If you use an SQLiteOpenHelper, you automatically get nice hooks that help you with the upgrade process.
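As a minimal sketch of those hooks (table and column names are made up; this assumes the world state were kept in SQLite rather than a flat file):
```java
import android.content.Context;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;

public class SaveGameDbHelper extends SQLiteOpenHelper {

    private static final String DB_NAME = "savegame.db";
    private static final int DB_VERSION = 2; // bump this whenever the save format changes

    public SaveGameDbHelper(Context context) {
        super(context, DB_NAME, null, DB_VERSION);
    }

    @Override
    public void onCreate(SQLiteDatabase db) {
        db.execSQL("CREATE TABLE world_state ("
                + "id INTEGER PRIMARY KEY, "
                + "level INTEGER NOT NULL, "
                + "state_blob BLOB)");
    }

    @Override
    public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) {
        // Called automatically when DB_VERSION is higher than the version on disk.
        // Migrate (or, at worst, discard) old data here instead of crashing on load.
        if (oldVersion < 2) {
            db.execSQL("ALTER TABLE world_state ADD COLUMN level INTEGER NOT NULL DEFAULT 1");
        }
    }
}
```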
I understand that you have a pretty complex savegame setup, but try to keep it as flexible as possible to allow for easy upgrades. There are lots of techniques that help you with that.
And Android and PCs are just completely different - on a PC, you sit down and play for hours. On Android, you play real quick and then do something else. Or, you play and get a phone call and are forced to switch away from your game.
