Proper way to cache values from remote database in an Android application - java

I'm well aware of how to communicate with an outside server from Android. Currently I'm using the relatively new AppEngine Connected Android project to do it, and everything works fairly well. The only things I'm concerned with are handling server downtime (if that ever happens) and loss of internet connectivity on the client side.
With that in mind, I have the following questions on an implementation:
What is the standard technique for caching values in a SQLite database on Android while still trying to constantly receive data from the web application?
How can I be sure that I have the most up-to-date information when that information is available?
How would I wrap up this logic (determining which source to pull from and whether or not the data is recent) in a ContentProvider?
Is caching the data even a good idea? Or should I simply assume that if the user isn't connected to the internet, the information isn't readily available?

Good news! There's an Android construct built for exactly this kind of thing: it's called a SyncAdapter. This is how all the Google apps do database syncing. Plus there's a great Google I/O video all about using it! It's actually one of my favorites. It gives you a nice, high-level overview of how to go about keeping remote resources synced with your server using REST.
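To make that concrete, here is a minimal sketch of the sync half, assuming a hypothetical provider authority com.example.provider and a placeholder fetchRemoteItems() helper standing in for your AppEngine endpoint call (neither is part of the framework). A real setup also needs the sync-adapter XML, a bound Service and an Account, which the I/O talk walks through.

    import android.accounts.Account;
    import android.content.AbstractThreadedSyncAdapter;
    import android.content.ContentProviderClient;
    import android.content.ContentValues;
    import android.content.Context;
    import android.content.SyncResult;
    import android.net.Uri;
    import android.os.Bundle;

    import java.util.Collections;
    import java.util.List;

    public class ItemSyncAdapter extends AbstractThreadedSyncAdapter {

        // Hypothetical provider authority; substitute your own ContentProvider.
        private static final Uri ITEMS_URI = Uri.parse("content://com.example.provider/items");

        public ItemSyncAdapter(Context context, boolean autoInitialize) {
            super(context, autoInitialize);
        }

        @Override
        public void onPerformSync(Account account, Bundle extras, String authority,
                                  ContentProviderClient provider, SyncResult syncResult) {
            try {
                // 1. Pull the latest rows from the web application.
                // 2. Write them into the local SQLite cache behind the ContentProvider;
                //    the provider decides whether each row is an insert or an update.
                for (ContentValues row : fetchRemoteItems()) {
                    provider.insert(ITEMS_URI, row);
                }
            } catch (Exception e) {
                // Ask the sync framework to retry later with backoff.
                syncResult.stats.numIoExceptions++;
            }
        }

        // Placeholder for the AppEngine endpoint call; replace with your real client code.
        private List<ContentValues> fetchRemoteItems() {
            return Collections.emptyList();
        }
    }

With this in place your Activities read only from the ContentProvider, so they always show the cached SQLite data even when offline, and the sync framework refreshes the cache in the background whenever connectivity is available, which covers the caching and freshness questions above.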

Related

Realtime Application with Nodejs MongoDB and JAVA as a backend

I am working on a web application that uses MongoDB as the database. Data is inserted into this DB by a Java application, and I want to somehow monitor/detect those inserts from a Node.js application so that I can push some information to the clients via socket.io.
I know this is quite easy when we remove the Java part from the equation and carry out the insertion via Node.js, but that is not the case for me, so I need pointers on a MongoDB-to-Node.js push kind of setup.
It would be very nice if the solution involved only Java, Node.js and MongoDB, but if some other third-party framework or technology (like a message queue) must be included, I would be happy to hear about it.
I would suggest looking at Apache Kafka Connect, which provides source connectors that can keep watch on your MongoDB. Confluent has created a MongoDB connector that provides this functionality; you can start from Confluent's documentation for further research. Apache Kafka is a message queue system.
I'd suggest having the Java app let your front-end (Node) app know when it has changed something. That leaves the responsibility of knowing what has changed, and how, with the system making the changes. Watching the DB for changes sounds like a good idea and will likely work, but it's far more likely to cause you issues. Consider:
What data to watch?
What changes do you consider worthy of action?
What happens when you change how your data is updated?
How do you know when a change is atomic?
All these issues are at least mitigated if your front end is simply told what has changed and how, where to get the resource, and when it happened.
How you tell your front end is up to you. Simple HTTP calls from Java to your front end are easy, if a little unreliable and unpredictable (load-wise). A queue/notification service like Amazon SNS might be a little more robust.
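For the simple HTTP variant, a minimal sketch might look like the following, assuming a hypothetical /notify endpoint on the Node app and Java 11's built-in HttpClient; the Node handler then broadcasts the event over socket.io.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class ChangeNotifier {

        // Hypothetical endpoint exposed by the Node app; socket.io fans the
        // event out to the browsers from there.
        private static final URI NODE_ENDPOINT = URI.create("http://localhost:3000/notify");

        private final HttpClient client = HttpClient.newHttpClient();

        // Call this right after the Java app writes to MongoDB.
        public void notifyFrontend(String collection, String documentId) {
            String json = String.format(
                    "{\"collection\":\"%s\",\"id\":\"%s\",\"changedAt\":%d}",
                    collection, documentId, System.currentTimeMillis());

            HttpRequest request = HttpRequest.newBuilder(NODE_ENDPOINT)
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(json))
                    .build();

            // Fire-and-forget is fine here; the Node app only needs the hint
            // about what changed and where to fetch it.
            client.sendAsync(request, HttpResponse.BodyHandlers.discarding());
        }
    }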

Recommended way to store chat history for a webapp

I'm implementing a web-based chat client using the Openfire API with Java.
Things are set up and running smoothly; however, I'm a little confused as to where to store chat history.
From what I've observed, desktop-based clients typically store chat history on the client-side filesystem using formats such as XML, TXT, etc.
On the server side, I have the following options:
Using a plain text file
Using JSON (MongoDB, HBase, etc.)
Using a database
But I would like to know which of the above options (or any others you can suggest) is best in terms of speed and performance.
Thanks.
As mentioned in isnot2bad's comment, you can add server-side message archiving through the Openfire Monitoring Plugin. Once you have that set up, you can try using XEP-0136 to fetch archived 1-to-1 chat messages over XMPP.
Openfire Plugins
Unfortunately, I have had nothing but trouble when trying to get messages out of the archive using the stanzas defined in XEP-0136. If you look around the OF support forum you will find that other people are also running into problems with this plugin. For example, the plugin will not return the list of conversations in the correct order, it will not filter the list of conversations or messages by the date specified in the start attribute, etc. To say the least, the plugin could use some work. As a workaround, I've left the plugin in place to take care of inserting the messages into the database, but I've written a custom AJAX solution for retrieving the archived messages: I just pull them directly out of OF's database and return them to my client-side JavaScript as a JSON object.
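For reference, a sketch of that kind of direct retrieval might look like the following. The table and column names (ofMessageArchive, fromJID, toJID, sentDate, body) are assumptions based on the Monitoring Plugin's schema, so verify them against your own Openfire database, and serialize the result with whatever JSON library you already use before handing it back to the AJAX call.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    public class ArchiveDao {

        // Assumed table/column names from the Monitoring Plugin schema;
        // check them against your Openfire database before relying on this.
        private static final String SQL =
                "SELECT fromJID, toJID, sentDate, body "
              + "FROM ofMessageArchive "
              + "WHERE (fromJID = ? AND toJID = ?) OR (fromJID = ? AND toJID = ?) "
              + "ORDER BY sentDate";

        private final String jdbcUrl;
        private final String user;
        private final String password;

        public ArchiveDao(String jdbcUrl, String user, String password) {
            this.jdbcUrl = jdbcUrl;
            this.user = user;
            this.password = password;
        }

        // Fetch the 1-to-1 history between two bare JIDs, oldest first.
        public List<Map<String, Object>> history(String jidA, String jidB) throws Exception {
            List<Map<String, Object>> messages = new ArrayList<>();
            try (Connection con = DriverManager.getConnection(jdbcUrl, user, password);
                 PreparedStatement ps = con.prepareStatement(SQL)) {
                ps.setString(1, jidA);
                ps.setString(2, jidB);
                ps.setString(3, jidB);
                ps.setString(4, jidA);
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        Map<String, Object> m = new HashMap<>();
                        m.put("from", rs.getString("fromJID"));
                        m.put("to", rs.getString("toJID"));
                        m.put("sentDate", rs.getLong("sentDate"));
                        m.put("body", rs.getString("body"));
                        messages.add(m);
                    }
                }
            }
            // Serialize with your JSON library of choice (Gson, Jackson, ...)
            // and return the result to the AJAX caller.
            return messages;
        }
    }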
XEP-0136 is due for replacement; it's always been overly complicated. XEP-0313 seeks to replace it, but I haven't found any implementations for OF. Good luck.

GAE/GWT server side data inconsistent / not persisting between instances

I'm writing a game app on GAE with GWT/Java and am having issues with server-side persistent data.
Players poll via RPC for active games and game states, all of which are stored on the server. Sometimes client polling fails to find game instances that I know should exist. This only happens when I deploy to Google appspot; locally everything is fine.
I understand this could be due to appspot being a cloud service that can spawn and use a new instance of my servlet at any point, so the existing data is not persisted between instances.
Single games only last a minute or two and the data changes rapidly (multiple times a second), so what is the best way to ensure that RPC calls to different instances use the same server-side data?
I have had a look at the Datastore API and it seems to be database-like storage, which I'm guessing will be way too slow for what I need. Also, Memcache can be flushed at any point, so that's not useful.
What am I missing here?
You have two issues here: persisting data between requests and polling data from clients.
When you have a distributed servlet environment (such as GAE) you cannot make a request to one instance, save data to memory, and expect that data to be available on other instances. This is true for GAE and any other servlet environment with multiple servers.
So you need to save data to some shared storage: the Datastore is costly, persistent, reliable and slow; Memcache is fast and free, but unreliable. Usually we use a combination of both. Some libraries even transparently combine both: NDB, Objectify.
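A minimal read-through/write-through sketch with the low-level Java APIs, assuming a hypothetical "Game" kind with a "state" property, could look like this:

    import com.google.appengine.api.datastore.DatastoreService;
    import com.google.appengine.api.datastore.DatastoreServiceFactory;
    import com.google.appengine.api.datastore.Entity;
    import com.google.appengine.api.datastore.EntityNotFoundException;
    import com.google.appengine.api.datastore.Key;
    import com.google.appengine.api.datastore.KeyFactory;
    import com.google.appengine.api.memcache.MemcacheService;
    import com.google.appengine.api.memcache.MemcacheServiceFactory;

    public class GameStateStore {

        private final MemcacheService cache = MemcacheServiceFactory.getMemcacheService();
        private final DatastoreService datastore = DatastoreServiceFactory.getDatastoreService();

        // Read: try Memcache first, fall back to the Datastore, repopulate the cache.
        public String loadState(String gameId) throws EntityNotFoundException {
            String cached = (String) cache.get(gameId);
            if (cached != null) {
                return cached;                      // fast path, shared by all instances
            }
            Key key = KeyFactory.createKey("Game", gameId);
            Entity game = datastore.get(key);       // slower but durable
            String state = (String) game.getProperty("state");
            cache.put(gameId, state);               // best effort; may be evicted at any time
            return state;
        }

        // Write: the Datastore is the source of truth, Memcache is just a copy.
        public void saveState(String gameId, String state) {
            Entity game = new Entity("Game", gameId);
            game.setProperty("state", state);
            datastore.put(game);
            cache.put(gameId, state);
        }
    }

Objectify, for example, can layer the Memcache step on transparently, so in practice you rarely write this read-through logic by hand.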
On GAE there is also a third option for semi-persistent shared data: backends. Those are always-on instances where you control startup/shutdown.
Data polling: if you have multiple clients waiting for updates, it's best not to use polling. Polling makes a lot of unnecessary requests (when the data did not change on the server), and there is still a minimum delay (since you poll at some interval). Instead of polling, use push via the Channel API. There are even GWT libs for it: gwt-gae-channel, gwt-channel-api.
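A sketch of the server half of that push, using the Channel API directly (the client id scheme here is just an assumption; a lib like gwt-gae-channel handles the browser side):

    import com.google.appengine.api.channel.ChannelMessage;
    import com.google.appengine.api.channel.ChannelService;
    import com.google.appengine.api.channel.ChannelServiceFactory;

    public class GamePushService {

        private final ChannelService channelService = ChannelServiceFactory.getChannelService();

        // Called once when the player joins; the returned token goes to the GWT
        // client, which opens the channel with it.
        public String openChannelFor(String playerId, String gameId) {
            // The client id just has to be unique per player/game pair.
            return channelService.createChannel(gameId + ":" + playerId);
        }

        // Called whenever the game state changes, instead of waiting to be polled.
        public void pushState(String playerId, String gameId, String stateJson) {
            channelService.sendMessage(
                    new ChannelMessage(gameId + ":" + playerId, stateJson));
        }
    }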
Short answer: You did not design your game to run on App Engine.
You sound like you've already answered your own question. You understand that data is not persisted across instances. The two mechanisms for persisting data on the server side are Memcache and the Datastore, but you also understand the limitations of these. You need to architect your game around this.
If you're not using Memcache or the Datastore, how are you persisting your data? (My best guess is that you aren't actually persisting it.) From the vague details, you have not architected your game to run across multiple instances, which is essential for any app running on App Engine. It's a basic design principle that you don't know which instance any HTTP request will hit. You have to rearchitect to use the Datastore plus Memcache.
If you want to use a single server, you can use backends, which behave like single servers that stick around (if you limit them to one instance). Frankly, though, because of the cost, you're better off with Amazon or Rackspace if you go this route. You will also have to handle scaling on your own, i.e. if a game is running on a particular server instance, you need to build a way for play of that game to consistently hit that instance.
Remember you can deploy GWT applications without GAE, see this explanation:
https://developers.google.com/web-toolkit/doc/latest/DevGuideServerCommunication#DevGuideRPCDeployment
You may want to ask yourself: Will your application ever NEED multiple server instances or GAE-specific features?
If so, then I agree with Peter Knego's reply regarding memcache etc.
If not, then you might be able to work around your problem by choosing a different hosting option (other than GAE), particularly one that lets you work with just a single instance. You could then indeed simply manage all your game data in server memory, as I understand you have been doing so far.
If this solution suits your purpose, then all you need to do is find a suitable hosting provider. This may well be a cloud-based PaaS offering, provided they let you put a hard limit (unlike GAE) on the number of server instances, and that it can go as low as one. For example, Heroku (currently) lets you do that, as far as I understand, and apparently it's suitable for GWT applications, according to this thread:
https://stackoverflow.com/a/8583493/2237986
Note that the above solution involves a bit of fiddling and I don't know your needs well enough to make a strong recommendation. There may be easier and better solutions for what you're trying to do. In particular, have a look at non-cloud-based hosting options and server architectures that are optimized for highly time-critical, real-time multiplayer gaming.
Hope this helps! Keep us posted on your progress.

iPad application - Access a Java-based web server

I would like to create a native iPad application that displays data fetched from a web server. The application should be able to fetch tabular data, schedule things on the web server, and receive alerts.
I suppose I could do the following:
For fetching tabular data, use a single web service call (will this work? what should the data interchange format be? are there limitations on the data payload?)
For receiving alerts, would a persistent connection strategy be the best approach, or are there better alternatives that I can tap into natively?
What remoting mechanisms are supported natively?
I have a Glassfish/Spring setup.
Thanks
Having no idea of the data makes it hard to answer.
A successful method applied by many is the web service approach: make a simple query when the app loads or is used, and fall back to showing the data that was loaded the last time the app had a connection.
If the data is time-sensitive, this is more of a dilemma.
You could simply note the last refresh time. If your app will be used primarily in the office, this might suffice.
Having a refresh button is a must.
The only reason to think about a persistent connection is if you want some form of server push, that is, if you need the server to inform the device of updates. Use cases for this are things like chat.
Otherwise, a timer asking for updates from the server is the way to go, since it is SO much easier to develop.
Apple's toolbox supplies NSURLConnection.
Your iPad app and your web server would have to be very loosely coupled.
Your question is very broad at the moment; as you go, other questions will arise.
One pointer, though: you must find an exchange protocol that suits your needs (e.g. JSON) and implement it on both sides. The choice depends on your experience and the data you want to exchange.
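As a sketch of the server half, assuming Spring MVC with Jackson on the classpath and a hypothetical /rows endpoint, something like this would return JSON that the iPad app can fetch via NSURLConnection and parse with NSJSONSerialization:

    import java.util.Arrays;
    import java.util.List;

    import org.springframework.stereotype.Controller;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.RequestMethod;
    import org.springframework.web.bind.annotation.ResponseBody;

    @Controller
    public class TableDataController {

        // Simple DTO; with Jackson on the classpath Spring serializes it to JSON.
        public static class Row {
            public String name;
            public double value;

            public Row(String name, double value) {
                this.name = name;
                this.value = value;
            }
        }

        // Hypothetical endpoint the iPad app would call, e.g. GET /rows
        @RequestMapping(value = "/rows", method = RequestMethod.GET)
        @ResponseBody
        public List<Row> rows() {
            // Replace with a real query against your backend.
            return Arrays.asList(new Row("cpu", 0.42), new Row("memory", 0.87));
        }
    }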

Peer-to-peer storage library in Java

I am developing a Java application that will work peer-to-peer. At any given time, there will be over 5000 clients online around the world. Each of these clients will create small files over time. I would like these files to be distributed amongst all the clients and stored, so that anyone can connect and download the full dump of files.
Is there any library that would help with that?
(I'm rephrasing this to make it a little more serious since it's still the only answer)
Try finding a Java BitTorrent library; I just googled and there were a bunch of them. I don't want to list them here because I don't have personal experience with any of them.
I can tell you that, the way it normally operates, BitTorrent still requires some centralized coordination (the tracker). I'm quite sure that Vuze works in a "trackerless", purely P2P mode, so I would look for that specifically when evaluating the libraries.
If this isn't some huge, widely distributed and heavily financed app you are creating, I highly recommend looking into some other kind of shared file system like Dropbox or even SVN. They are not P2P, but they are known to work reliably, and at least Dropbox can run completely in the background, unattended and ignored, on any platform for years without trouble.
