I would like to set an arbitrary time in my application. The time is downloaded from a server in milliseconds, so it should be independent of the locale and other system preferences.
But the application requires a thread-safe solution, and the standard Calendar object is not thread-safe.
What is the best way?
Today I use:
Calendar calendar = Calendar.getInstance(TimeZone.getTimeZone("UTC"));
calendar.setTimeInMillis(serverTime);
But that is not a good way for me, because it isn't thread-safe.
tl;dr
The program has to contain its own internal clock, fetching the time from an external server. The clock must be thread-safe.
The time in milliseconds that a Java application uses is
the specified number of milliseconds since the standard base time
known as "the epoch", namely January 1, 1970, 00:00:00 GMT.
This number is based on the GMT time zone. If you need to print it in another time zone you can use any formatting class you want, say SimpleDateFormat. If you need to make the variable that holds it thread-safe, make it volatile (or synchronize access to it), possibly by wrapping it in a class.
public class TimeInMillis {
    // volatile makes a write by one thread immediately visible to all readers
    private volatile long time;

    public void setTime(long time) { this.time = time; }

    public long getTime() { return time; }
}
Whenever you need to display it, just get the TimeInMillis object, get the time and create a Calendar object with it. Then use a formatting class to print the time in the format, locale, timezone, you require.
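For example, a minimal formatting sketch, assuming timeInMillis is an instance of the TimeInMillis class above (note that SimpleDateFormat itself is not thread-safe, so create one per use or per thread):

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;
import java.util.TimeZone;

// Format the stored instant for display in a chosen zone and locale.
SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss", Locale.US);
fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
String text = fmt.format(new Date(timeInMillis.getTime()));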
Time is downloaded from server in milliseconds format- it should be independent from locale and other system preferences
That isn't "time". That is a timestamp, meaning a particular time value reported by a particular piece of software at a particular point in time.
Now, if you are trying to say that, in future communications with this server, you want to translate time as reported by the device into the timebase as known by the server, that makes at least a bit of sense. In that case, you compute the delta between the device time when you receive the timestamp and the time value in the timestamp itself. Then, you apply that delta to future times you report back to the server.
Program have to contain own internal clock fetching time from external server
That makes no sense whatsoever.
In this universe, based on our current knowledge of physics, time is continuous and linear. Time does not change only when you are "fetching time from an external server". Again, what you are "fetching... from an external server" is a timestamp, a statement of what the clock on the server thought the time was at the time you made the request. You can use that timestamp for comparison purposes with other timestamps from that server, and you can use that timestamp to compare to the device's current time to determine the difference.
However, you cannot create hardware in Java code, and so you cannot create an "internal clock" in Java code. The only clock is the device's clock. As noted earlier, you can translate the device's clock's time into the timebase of the server by applying the aforementioned difference, but the passage of time is still being marked by the device's own clock.
Since the difference is going to be an int or long, you can use AtomicInteger or AtomicLong if you are concerned about multiple threads working with the value at once.
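A minimal sketch of that delta idea, assuming the hypothetical names ServerTimeOffset and onServerTimestamp():

import java.util.concurrent.atomic.AtomicLong;

public class ServerTimeOffset {
    // delta = server timestamp minus device time, captured when the timestamp arrives
    private final AtomicLong offsetMillis = new AtomicLong(0);

    public void onServerTimestamp(long serverTimestampMillis) {
        offsetMillis.set(serverTimestampMillis - System.currentTimeMillis());
    }

    // Device time translated into the server's timebase.
    public long serverNowMillis() {
        return System.currentTimeMillis() + offsetMillis.get();
    }
}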
I would just use the time in milliseconds with GMT (by the way, computers don't support UTC as such).
long serverTime = System.currentTimeMillis(); // millis since 1/1/1970 GMT.
To get/set this in a thread-safe manner, you can make the field volatile.
By the way, Calendar is pretty slow, even on a PC. I would avoid using it on a phone.
Related
I want to disregard the device time and implement my own clock inside my app. The time I need will come from the server.
I have already set its date and time as follows:
Date.setDate()
and
Date.setTime()
Now, I just need to get my private Date running like a clock.
How can I do this?
EDIT:
So far, I have not implemented anything, as I have no idea how to create a clock/timer starting from a specific date/time object. I am thinking of having a thread and a runnable for ticking, but I don't know how to increment it to make it work just like a clock.
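One way to get that "ticking" without any thread at all is to anchor the server time to the device's monotonic clock when it arrives, and derive the current value on demand. A minimal sketch (ServerClock is a hypothetical name; note that System.nanoTime() restarts on reboot, so re-sync with the server after a restart):

public class ServerClock {
    private final long baseServerMillis; // server time at the moment of sync
    private final long baseNanos;        // device monotonic clock at the moment of sync

    public ServerClock(long serverTimeMillis) {
        this.baseServerMillis = serverTimeMillis;
        this.baseNanos = System.nanoTime(); // monotonic, immune to wall-clock changes
    }

    // Thread-safe: both fields are final, so no synchronization is needed.
    public long currentTimeMillis() {
        long elapsedMillis = (System.nanoTime() - baseNanos) / 1_000_000L;
        return baseServerMillis + elapsedMillis;
    }
}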
I am new here :) I am developing a small app in Android (Java) and I need to measure the seconds elapsed between two events.
The problem with System.currentTimeMillis() is that the user of the device can change the system date, for example after the first event; when I take the value returned by System.currentTimeMillis() after the second event and compute the difference between the two values, the result is not valid at all.
Another option I tried was System.nanoTime(). Even if the user changes the system time, the seconds count stays valid. But the problem here is that if the user switches the device off after the first event, the value returned by System.nanoTime() after the second event is not valid: the counter of System.nanoTime() restarts with the device, and therefore the elapsed time is again invalid.
Does anybody know a method to count the seconds between two events that tolerates both user date changes and device restarts?
Thanks in advance!
Since you want to avoid the errors that can be introduced by the user messing with the system date, you cannot rely on that source of information. A reliable (and accurate, if that matters) time can be obtained using the NTP (Network Time Protocol) protocol. Take a look at this question for more details about it: Java NTP client.
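A minimal sketch using the Apache Commons Net NTP client (the server pool.ntp.org and the 5-second timeout are arbitrary choices; getTime() throws IOException, so call this from a method that handles it):

import java.net.InetAddress;
import org.apache.commons.net.ntp.NTPUDPClient;
import org.apache.commons.net.ntp.TimeInfo;

NTPUDPClient client = new NTPUDPClient();
client.setDefaultTimeout(5000); // don't hang forever on a bad network
try {
    TimeInfo info = client.getTime(InetAddress.getByName("pool.ntp.org"));
    info.computeDetails(); // fills in the local clock offset
    Long offset = info.getOffset(); // may be null if it could not be computed
    long trustedNow = System.currentTimeMillis() + (offset != null ? offset : 0L);
} finally {
    client.close();
}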
An alternative you may consider: instead of finding a reliable clock to compute the date/time difference, you can check whether the user has changed the system clock. A simple way would be to store the timestamp and check periodically (every second or so) whether the new timestamp you get from the system clock is smaller than (i.e., before) the previous one. If so, you can take action.
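A minimal sketch of that detection idea (ClockChangeDetector and onClockTampered() are hypothetical names; the one-second interval is arbitrary):

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class ClockChangeDetector {
    private long last = System.currentTimeMillis(); // only touched by the scheduler thread

    public void start() {
        ScheduledExecutorService ses = Executors.newSingleThreadScheduledExecutor();
        ses.scheduleAtFixedRate(() -> {
            long now = System.currentTimeMillis();
            if (now < last) {
                onClockTampered(); // clock moved backwards: the user likely changed the date
            }
            last = now;
        }, 1, 1, TimeUnit.SECONDS);
    }

    protected void onClockTampered() { /* invalidate the measurement */ }
}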
I have a piece of code that has to be executed at a particular time every day. If I schedule it to be executed at 9PM every day, then it has to keep working across daylight saving time changes.
Which Java API can be used to achieve this?
int ONE_DAY = 1000 * 60 * 60 * 24;
Timer timer = new Timer();
timer.schedule(myTimerTask, startTime, ONE_DAY); // startTime is 9PM of current day
I've used the above approach which will not take care of DST.
If you need to schedule based on calendrical values - rather than just elapsed time, basically - then you either need to wrap Timer in your own code, or use a library which has already been built for this purpose. In this case, I suspect the Quartz Scheduler is your best bet.
Given how complicated date/time can be, I'd generally recommend using a well-known library over rolling your own code. Note that this often doesn't mean that you can get away without thinking about complicated aspects of the problem - it just means that you should be able to express your requirements fairly simply. For example, in the context you're looking at, you should consider the following (a sketch of one approach follows this list):
What time zone do you want the "9pm" to be expressed in? Is it the system default time zone? Some other specific one? Multiple different time zones for different tasks?
What do you want to happen if the scheduled time doesn't occur, or occurs twice on one day? You're likely to be okay with 9pm, but if you had (say) 1.30am in the UK time zone, when the clocks go forward into BST, that will be skipped for that day - and when the clocks go back into GMT, it will occur twice.
How do you want to handle the system clock being changed, either manually or automatically?
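If Quartz is too heavy, here is a minimal sketch of the wrap-it-yourself route using java.time (Java 8+) and a ScheduledExecutorService; the America/New_York zone is just an example. Recomputing the delay after every run is what keeps it correct across DST changes:

import java.time.Duration;
import java.time.LocalTime;
import java.time.ZoneId;
import java.time.ZonedDateTime;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class DailyScheduler {
    private final ScheduledExecutorService executor = Executors.newSingleThreadScheduledExecutor();
    private final ZoneId zone = ZoneId.of("America/New_York"); // pick your zone explicitly

    public void scheduleDaily(Runnable task, LocalTime at) {
        ZonedDateTime now = ZonedDateTime.now(zone);
        ZonedDateTime next = now.with(at);               // today at the target time
        if (!next.isAfter(now)) next = next.plusDays(1); // already passed: tomorrow
        long delayMillis = Duration.between(now, next).toMillis();
        executor.schedule(() -> {
            task.run();
            scheduleDaily(task, at); // recompute the next run; DST-aware via the zone rules
        }, delayMillis, TimeUnit.MILLISECONDS);
    }
}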
You can schedule the timer to run the task each hour and let the task decide, using Calendar, when to actually run.
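A minimal sketch of that hourly-check approach (the real work goes where the comment is; you may also want a guard so it cannot run twice in one day):

import java.util.Calendar;
import java.util.Timer;
import java.util.TimerTask;

Timer timer = new Timer();
long ONE_HOUR = 60L * 60 * 1000;
timer.scheduleAtFixedRate(new TimerTask() {
    @Override public void run() {
        // Ask the wall clock each time; DST shifts are reflected automatically.
        if (Calendar.getInstance().get(Calendar.HOUR_OF_DAY) == 21) {
            // 9 PM local time: run the real task here
        }
    }
}, 0, ONE_HOUR);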
If my application triggers an event at 9pm EST, it should also trigger this event at 6pm PST.
Currently I am parsing a feed, and this feed says that the event will run at 9pm EST. Without altering the feed, what is the best way to make my code universal for anyone who opens the application in any timezone?
You always store dates as a UTC time (with a time zone stored separately). Then all alarm devices can easily convert that into their own local time, and make the alarm at the appropriate time.
Alarms that should fire at 9am local time, no matter the time zone, are stored without a time zone.
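A minimal sketch of the convert-on-display step (feedEventUtcMillis is a hypothetical name for the event instant stored as UTC epoch milliseconds):

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

SimpleDateFormat local = new SimpleDateFormat("yyyy-MM-dd HH:mm");
local.setTimeZone(TimeZone.getDefault()); // the device's own zone
// The same instant prints as 9pm on an EST device and 6pm on a PST device.
String display = local.format(new Date(feedEventUtcMillis));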
You can get the device's current time zone like so:
TimeZone.getDefault()
...and then convert the current time of the device and the time of the feed to GMT.
Now you must test if they are equal.
Use time-to-now (relative) formatting via PrettyTime (http://www.ocpsoft.org/prettytime/).
It will give you results like "4 hours ago" or "moments ago", which removes the time zone factor.
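A minimal usage sketch, assuming the PrettyTime library is on the classpath:

import java.util.Date;
import org.ocpsoft.prettytime.PrettyTime;

PrettyTime p = new PrettyTime();
// An instant four hours in the past is rendered relative to "now", zone-free.
String ago = p.format(new Date(System.currentTimeMillis() - 4L * 60 * 60 * 1000));
// ago is something like "4 hours ago"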
I am working on Google App Engine, on something that requires me to act only if the time difference between the time sent by the client and the time at the server is less than 15 seconds. The code works perfectly when I try it on the test server (on my own machine), but fails when the code is deployed on App Engine. I am guessing that is probably because the timezone at the server is different, so a few hours may get added/subtracted.
What I basically do is let the client send his timestamp along with some other data. When the server subsequently needs to calculate the time difference, I subtract the server's timestamp from the client's.
If the timezone is different then clearly I will run into problems.
I know that one solution to avoid this timezone fiasco is to let the server timestamp both the initial request and the subsequent processing, but for my application it is essential that a timer starts ticking right when the client makes a request, and that the server takes no action if 15 seconds have passed from the client's point of view.
Since I am the developer of both client side and server side I can show you what I have done.
Sample client side call
new Register().sendToServer(username,location,new TimeStamp(new Date().getTime()));
this is stored in data-store and retrieved a little bit later, if some conditions are met.
Server side sample to find difference
Date date = new Timestamp(new Date().getTime()); // the server's current time
Date d1 = (Date) timeStampList[i];               // the client's stored timestamp
long timedif = Math.abs(d1.getTime() - date.getTime());
if (timedif <= 15000) {
    // do something
} else {
    // do something else
}
So basically, my question is how do I normalize the timezone variations ?
thanks
Simply use absolute Unix time: System.currentTimeMillis().
This time is absolute, i.e. the number of milliseconds since Jan 1st, 1970, midnight UTC, so you can simply calculate the difference.
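A minimal sketch of the comparison (clientSentAtMillis is a hypothetical name for the epoch value the client posted):

long serverNowMillis = System.currentTimeMillis();
long timeDiff = Math.abs(serverNowMillis - clientSentAtMillis);
boolean withinWindow = timeDiff <= 15_000; // the 15-second rule, zone-independent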
As @Diego Basch pointed out, if the time difference is on the order of 30 minutes to whole hours, you should deal with the timezone difference, because your client is not in the same timezone as the server.
You can determine the client timezone in JavaScript with new Date().getTimezoneOffset(). Note: the offset to UTC is returned in minutes.
On the server you should be able to determine the timezone with Calendar.getInstance().getTimeZone().
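One caveat worth a sketch: the two calls use opposite sign conventions, so don't mix them up when normalizing. Java's getOffset() returns (local - UTC) in milliseconds, while JavaScript's getTimezoneOffset() returns (UTC - local) in minutes:

import java.util.Calendar;

// The server zone's offset from UTC in minutes, DST included, at this instant.
int serverOffsetMinutes = Calendar.getInstance().getTimeZone()
        .getOffset(System.currentTimeMillis()) / 60_000;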