I have an Android project with a total size of 280 MB, but when I build it, the APK created is just 6 MB. How is such a compression factor achieved? I have many third-party libraries and resources in my project.
The most direct answer is that you are compiling an APK, not compressing it. You are changing the nature of the content, not making the content smaller.
Lots of things happen during a compile that can reduce the size of the pre-compiled content.
Here is an explanation of the C++ compile process on SO that is pretty good. APK compilation is different, but the ideas are largely the same.
How does the compilation/linking process work?
Here is one that is more specific to the APK compile process. I'm not familiar enough with that process to know how accurate the article is, but it seems decent.
http://www.alittlemadness.com/2010/06/07/understanding-the-android-build-process/
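If you want to see where the size goes, remember that an APK is just a zip archive: you can list what actually ends up inside it and how much each entry was compressed. Below is a rough sketch using java.util.zip (the APK path is passed as a command-line argument); the point is that your source files, the full third-party jars, and the rest of the 280 MB project tree simply aren't in there, only the compiled dex file and packaged resources are.

    import java.util.Enumeration;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipFile;

    public class ApkContents {
        public static void main(String[] args) throws Exception {
            // args[0]: path to the APK (an APK is just a zip archive)
            try (ZipFile apk = new ZipFile(args[0])) {
                Enumeration<? extends ZipEntry> entries = apk.entries();
                while (entries.hasMoreElements()) {
                    ZipEntry e = entries.nextElement();
                    // uncompressed size -> compressed size, per entry
                    System.out.printf("%10d -> %10d  %s%n",
                            e.getSize(), e.getCompressedSize(), e.getName());
                }
            }
        }
    }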
Related
So I've been following the tutorial found at http://upandcrawling.wordpress.com/2013/08/16/libgdx-and-google-play-game-service-integration/
And I finally got everything into my application, but I can't compile it in Eclipse. I've looked up ways to change the amount of Java heap space, but none really make sense or apply. This is because with libgdx you can't go into Window and Preferences and change settings on a class; the classes used to start libgdx just don't appear.
2 major questions:
When I try to compile the application, it takes quite a few minutes, which it never did before. Is this normal, or should it still only take approximately a minute?
When it does compile, it ends with a dex error, so I know it gets through the basic compiling, but why am I getting a Java heap error? After all, my app isn't a 1 GB app; I'd reckon it's only about 10 MB. Is there a way to increase the memory available in Eclipse, and if so, how? (Preferably with screenshots; I'm very visual.)
An output of your log files would be helpful in troubleshooting.
This is normal (for me, anyway); when I compile, it can take several minutes to build everything.
Check out these links:
https://wiki.eclipse.org/FAQ_How_do_I_increase_the_heap_size_available_to_Eclipse%3F
Increasing heap space in Eclipse: (java.lang.OutOfMemoryError)
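As a concrete illustration of what those links describe (the numbers here are just examples; pick values that suit your machine), the heap available to Eclipse itself is set in the eclipse.ini file next to the Eclipse executable, in the lines after -vmargs:

    -vmargs
    -Xms512m
    -Xmx1024m

Note that if the OutOfMemoryError comes from the dex step rather than from Eclipse itself, raising Eclipse's heap may not be the fix, since the dx tool runs with its own memory settings, so check where the error is actually thrown.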
This is in terms of code compilation and nothing else. :)
So, I am a newbie in my company and predictably got stuck with an awesomely slow computer. And I am having a big problem with NetBeans running out of memory/resources every time I make a build. I am compiling my Java files.
I was using 7.0, and even though I was getting this error, I got around it by compiling the source packages in chunks (sometimes I had to compile the selected ones more than once).
But ever since I moved to 7.2, this problem has been getting worse. I now have to compile the packages in even smaller chunks, sometimes package by package and file by file, which costs me a lot of time and even a lot of hair.
I have no idea which packages to compile first; NetBeans was taking care of that, and therefore using resources.
Most of my colleagues have powerful computers and have no problem building the whole source base. So, I started getting the compiled packages from them and only building the required ones.
So, is this the correct approach, or should I build the whole source (even though I only change about 1% of the total code base at any given time)?
Almost everyone in this company builds the whole code base at least once, even though most of the changes are only in that 1%.
It is far better to build the entire project and have it work as designed than to build 99% of it and have it not work. There's no indication of whether the 1% is critical or non-critical code, and as a beginner, you can't tell that right off the bat.
I would inform your teammates/IT personnel about the slow build and ask what can be done to resolve it, instead of building the code in chunks.
Maybe you should highlight the issue of a developer's slow machine impeding the work you are doing; when you explain the cost of lost productivity versus the cost of hardware, you will shortly have a new machine.
Then you can stop worrying about building "99%" and get on to real issues.
It's better to build the entire project. Try tuning netbeans.conf:
netbeans_default_options="-J-client -J-Xss4m -J-Xms128m -J-XX:PermSize=128m -J-XX:MaxPermSize=512m -J-Dapple.laf.useScreenMenuBar=true -J-Dapple.awt.graphics.UseQuartz=true -J-Dsun.java2d.noddraw=true -J-XX:+UseParNewGC -J-XX:+UseConcMarkSweepGC -J-XX:+CMSClassUnloadingEnabled -J-XX:+CMSPermGenSweepingEnabled"
So, is this the correct approach, or should I build the whole source (even though I only change about 1% of the total code base at any given time)?
I think you can build only some parts of the project if you know all the internal dependencies perfectly and can guarantee that no unexpected behaviour happens in a nearby module after your modifications are made. That is my opinion. Moreover, you can change code and compile it successfully, and yet the entire project build can still fail.
P.S. You should get the company to buy you a new computer.
In theory you could walk through all the dependencies and make yourself a dependency hierarchy map, and then you would only have to compile the code you've changed plus everything that depends on it. However, it's not necessarily 100% foolproof, and it requires a lot of effort for very little gain. It's not something I would expect to be a newbie's responsibility to sort out; rather, your superiors should get you sorted out with some appropriate kit.
After adding Google Guava r09 to our Android project, the build time increased significantly, especially the DEX generation phase. I understand that DEX generation takes all our classes plus all the jars we depend on and translates them to DEX format. Guava is a pretty big jar, around 1.1 MB.
Can it be the cause of the build slowdown?
Is there anything that can be done to speed this up?
P.S. Usually I build from IntelliJ, but I also tried building with Maven - same results.
Thanks
Alex
For what it's worth, my gut is that this isn't the cause. It's hard to take a long time doing anything with a mere 1.1 MB of bytecode; I've never noticed dex taking any significant time. But let's assume it is the issue for the sake of argument.
If it matters enough, you could probably slice up the Guava .jar to remove whole packages you don't use. It is composed of several pieces that aren't necessarily all inter-related.
I don't think this is going to speed things up, but it may be worth mentioning: if you run the build through ProGuard (the optimizer now bundled with the SDK), it can remove unused classes before you get to DEX (and do a bunch of other great optimizations on the bytecode). But of course that process probably takes longer itself than dex-ing.
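If you do try the ProGuard route, a minimal configuration along the lines of the sketch below would let the shrinker drop the Guava packages you never reference before dx runs. The keep rules shown are only illustrative; the stock config generated by the SDK tools already contains the Android-specific ones, and you would keep whatever your app actually uses reflectively.

    # Keep classes that Android instantiates by name
    -keep public class * extends android.app.Activity
    -keep public class * extends android.app.Application

    # Guava references some optional dependencies; don't fail the build over them
    -dontwarn com.google.common.**

    # Anything unreferenced (including unused Guava packages) is stripped by the shrinker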
Our system has a problem with too many files, which are used in a webapp that should be available all the time. That means the files cannot be deleted, and there are too many of them, making the system (which is a Windows machine) slow. We would like to zip up the files and, when a file is requested, unzip that particular file.
I've tried the Java ZipFile class, and the performance is not good enough, because many people will be using the webapp and requesting the files. From my observation, the unzipping takes between 0.5 and 2 seconds, and when there are too many users, the system cannot keep up with them.
For example, I've used JMeter to simulate a situation where 30 users use the system, with a random delay between 0.3 and 0.6 seconds. Although I doubt there will be that many requests, I cannot know in advance how many people will use the webapp. I would like to ask you guys: is there any other method to solve this problem?
Thanks in advance!!
P.S. If any third-party library is needed, it must be free!
P.S. Because the number of files is just too large, it hangs the machine. We would like to do this: zip up 2000 files into one zip file, so that the number of files decreases and hopefully the system won't hang anymore, and when needed, we unzip a particular file out.
Okay, here are some thoughts. It appears to me that your core problem is the slowness of your system and that you're trying to fix it by compressing the files and decompressing them on demand. Then you've found that the decompression is too slow and you need a faster way to do that.
Now I'm not entirely certain why you think this compression will speed things up instead of making things slower.
I would go back to the original problem and work more on solving that. Why is the number of files making your system slow? If you can figure that out, you can fix it in a way that doesn't involve things going even slower.
If it's an issue with too many files in a directory, think about splitting into multiple directories. But I have no idea whether NTFS even has that problem (FAT did). For example, if you have a directory with files for every minute of the last ten years (five million files), you can split them into day directories (three and a half thousand directories with fifteen hundred files in each).
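A rough Java sketch of that kind of split is below; the bucket count of 256 and the base directory name are arbitrary choices for illustration. The idea is to derive a stable subdirectory from the file name so that no single folder grows huge and a file can still be located directly.

    import java.io.File;

    public class FileBuckets {
        private static final int BUCKETS = 256; // arbitrary; pick a count that keeps folders small

        // Map a file name to a stable subdirectory, e.g. files/137/report.pdf
        public static File locate(File baseDir, String fileName) {
            int bucket = (fileName.hashCode() & 0x7fffffff) % BUCKETS;
            return new File(new File(baseDir, String.valueOf(bucket)), fileName);
        }

        public static void main(String[] args) {
            File base = new File("files"); // placeholder base directory
            System.out.println(locate(base, "report.pdf"));
        }
    }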
Compression won't reduce the number of files, just the space taken by them.
If it's an issue with the number of files on the system (rather than in a directory), there are plenty of ways to split files between systems as well. For example, hive off 10% of the entire file set to each of ten different machines and forward incoming requests for a specific file to the relevant machine.
But, I have to say, I've seen Windows machines handle absolute bucket-loads of files so I'd be very surprised if the problem lay there. I think you're probably just going to have to track down what's actually causing your "hangs".
Compressing/uncompressing the files will not make Windows faster.
If zip doesn't provide a performance gain (despite having a native implementation in Java), you can try to improve things at the filesystem level. Folders with too many (>10,000) files don't work well under some Windows filesystems, so try dividing the files into several folders, tuning the NTFS filesystem (cluster size, reserved space for the filesystem), disabling antivirus, disabling indexing, buying an SLC SSD...
Just wondering if it is generally a good idea to compress JAR files that will be shipped with a desktop application (no network access to the JARs), or if the decompression will have a bigger impact than file I/O.
EDIT: Thanks for the answers so far, and sorry for being a bit unclear here. I was not speaking about shipping the JARs to the customer, but about the optimal format for the JAR files on disk when the app starts up. I know that JAR files are zip files and can be created with different compression levels (or no compression at all), and I was directly wondering how compression would alter startup performance, not only on my dev box (which has a fast SSD in it) but also on slower disks.
I expect that the answer depends on your application. However, it should be easy to determine experimentally if compressed JARs give faster or slower startup for your application. Just build your application JAR file with compression on and compression off, and compare the application startup times. (Try it on different machines; e.g. with slow discs, fast discs, SSD, and with different amounts of RAM. Bear in mind that some OSes cache files aggressively, and take this into account in your timing measurements.)
While you are at it, you should also investigate the impact of different compression levels (via the jar command options) and of using pack200.
Having said that, my gut feeling is that the difference between compressed and uncompressed for locally installed JARs will be small enough that the user will hardly notice the difference.
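One quick way to run that experiment is a tiny timing harness like the sketch below (the class and arguments are placeholders): point it at a compressed build and then at an uncompressed build of the same JAR, and compare the time to load a representative class. Run each variant several times and discount the first runs, since the OS file cache will skew them.

    import java.io.File;
    import java.net.URL;
    import java.net.URLClassLoader;

    public class JarLoadTimer {
        public static void main(String[] args) throws Exception {
            // args[0]: path to the JAR under test, args[1]: fully qualified class name to load from it
            URL jarUrl = new File(args[0]).toURI().toURL();
            long start = System.nanoTime();
            try (URLClassLoader loader = new URLClassLoader(new URL[] { jarUrl })) {
                loader.loadClass(args[1]);
            }
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;
            System.out.println("Loaded " + args[1] + " from " + args[0] + " in " + elapsedMs + " ms");
        }
    }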
In almost any reasonable desktop situation, the cost of disk IO is way higher than the cost of compression. It'll almost certainly be a win to compress files.
That said, a JAR file is already compressed. Doubly compressing things is generally not worth the effort. So I'd say no, don't compress your JAR files as they are already compressed.