I am working on Google App Engine, on something that requires me to act only if the time difference between the time sent by the client and the time at the server is less than 15 seconds. The code works perfectly when I try it on the test server (on my own machine), but fails when the code is deployed on App Engine. I am guessing that is probably because the timezone at the server is different, so a few hours get added/subtracted in the calculation.
What I basically do is let the client send its timestamp along with some other data. When the server subsequently needs to calculate the time difference, I subtract the server's timestamp from the client's.
If the timezone is different then clearly I will run into problems.
I know that one solution to avoid this timezone fiasco is to let the server timestamp both the initial request and the subsequent processing later on, but for my application it is essential that a timer starts ticking right from when the client makes the request, and that if 15 seconds have passed with respect to the client, no action is taken by the server.
Since I am the developer of both the client side and the server side, I can show you what I have done.
Sample client-side call:
new Register().sendToServer(username, location, new Timestamp(new Date().getTime()));
This is stored in the datastore and retrieved a little later, if certain conditions are met.
Server-side sample to find the difference:
Date serverDate = new Date();                          // server's current time
Date clientDate = (Date) timeStampList[i];             // client's stored timestamp
long timeDiff = Math.abs(clientDate.getTime() - serverDate.getTime());
if (timeDiff <= 15000) {
    // do something
} else {
    // do something else
}
So basically, my question is: how do I normalize the timezone variations?
thanks
Simply use absolute Unix time: System.currentTimeMillis().
This time is absolute (i.e. the number of milliseconds since January 1st, 1970, 00:00 UTC), so you can simply calculate the difference.
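For illustration, a minimal sketch of this comparison (the 15-second window comes from the question above; the variable names are mine):

// Client side: capture epoch millis, which are timezone-independent
long clientTimestamp = System.currentTimeMillis();

// Server side, later: compare against the server's own epoch millis
long now = System.currentTimeMillis();
if (Math.abs(now - clientTimestamp) <= 15000) {
    // within the 15-second window: act
} else {
    // window missed: take no action
}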
As #Diego Basch pointed out, if the time difference is on the order of 30 minutes to whole hours, you should deal with the timezone difference, because your client is not in the same timezone as the server.
You can determine the client timezone in JavaScript with new Date().getTimezoneOffset(). Note: the offset to UTC is returned in minutes.
On the server you should be able to determine the timezone with Calendar.getInstance().getTimeZone().
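A small server-side sketch of reading that zone and its current UTC offset (note that JavaScript's getTimezoneOffset() uses the opposite sign convention from Java's getOffset()):

import java.util.Calendar;
import java.util.TimeZone;

// Read the server's zone and its current offset from UTC in minutes
TimeZone serverZone = Calendar.getInstance().getTimeZone();
int offsetMinutes = serverZone.getOffset(System.currentTimeMillis()) / 60000;
System.out.println(serverZone.getID() + " is " + offsetMinutes + " minutes from UTC");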
Related
I have a time-based Google authentication code (20-second validity); I need to check the time before reading the 4-digit code.
Collect the Google auth code using TOTP
Apply the code automatically in our application
Problem:
While reading the code at the edge of its window (the 18th/19th second) and sending it automatically to our text box, the validity had expired and authentication failed. So I want to check the code along with its validity time:
a. if the validity time is greater than 10 seconds, I can get the code and pass it to the text box
b. if the validity time is less than 10 seconds, wait for 10 seconds
code:
import org.apache.commons.codec.binary.Base32;   // from commons-codec-1.8.jar
import org.apache.commons.codec.binary.Hex;
import de.taimos.totp.TOTP;                      // assumed: totp-1.0.jar is de.taimos:totp

public static String getTOTPCode(String secretKey) {
    // Normalize the Base32 secret, decode it, and hex-encode the raw key
    String normalizedBase32Key = secretKey.replace(" ", "").toUpperCase();
    Base32 base32 = new Base32();
    byte[] bytes = base32.decode(normalizedBase32Key);
    String hexKey = Hex.encodeHexString(bytes);
    return TOTP.getOTP(hexKey);
}
Jar files:
commons-codec-1.8.jar
totp-1.0.jar
Referring to the above, how can I get the remaining validity time for the OTP?
See the TOTP spec. Servers should generally allow codes from ±1 time interval anyway, so most probably you don't have to be concerned with such artificial delays.
Other than that, depending on the TOTP parameters, you should know when the next time interval begins. So you can just check how close you are to that boundary based on the current time.
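A minimal sketch of that boundary check, assuming the standard 30-second TOTP time step (adjust the constant if your server uses a different interval, such as the 20 seconds mentioned above):

// Seconds remaining in the current TOTP window
public static long secondsRemainingInWindow() {
    final long TIME_STEP_SECONDS = 30; // assumption: standard TOTP step
    long nowSeconds = System.currentTimeMillis() / 1000;
    return TIME_STEP_SECONDS - (nowSeconds % TIME_STEP_SECONDS);
}

// Usage, following the question's rule of thumb:
// if (secondsRemainingInWindow() > 10) { /* use the code now */ }
// else { /* wait for the next window before reading the code */ }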
P.S. I heard some servers adjust time calculations based on the client's previous auth attempts, so that client/server time shifts do not break auth, e.g. when the client machine doesn't use NTP and thus its clock drifts.
Update: about generating the timestamp, see TOTP: Do the seconds count?
This question already has an answer here: I want to get current time in Java but without the Internet and without the system time (1 answer). Closed 6 years ago.
I am new here :) I am developing a small app on Android (Java) and I need to measure the seconds elapsed between two events.
The problem with System.currentTimeMillis() is that the user of the device can change the system date, for example after the first event; so when I take the value returned by System.currentTimeMillis() after the second event and compute the difference between the two values, the resulting difference is not valid at all.
Another option I tried was System.nanoTime(). Even if the user changes the system time, the seconds count stays valid. But here the problem is that if the user switches off the device after the first event, the value returned by System.nanoTime() after the second event is not valid, because the counter behind System.nanoTime() restarts with the device, and therefore the elapsed time is again not valid.
Does anybody know a method to count the seconds between two events that survives both user date changes and device restarts?
Thanks in advance!
Since you want to avoid the errors that can be introduced by the user messing with the system date, you cannot rely on that source of information. A reliable (and, if it matters, accurate) time can be obtained using the NTP (Network Time Protocol) protocol. Take a look at this question for more details about it: Java NTP client.
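For reference, a minimal sketch using Apache Commons Net's NTP client; the commons-net dependency, the method name, and the pool.ntp.org host are assumptions for illustration:

import java.net.InetAddress;
import org.apache.commons.net.ntp.NTPUDPClient;
import org.apache.commons.net.ntp.TimeInfo;

public static long networkTimeMillis() throws Exception {
    NTPUDPClient client = new NTPUDPClient();
    client.setDefaultTimeout(5000); // don't hang forever on an unreachable server
    try {
        TimeInfo info = client.getTime(InetAddress.getByName("pool.ntp.org"));
        // The server's transmit timestamp, independent of the local system clock
        return info.getMessage().getTransmitTimeStamp().getTime();
    } finally {
        client.close();
    }
}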
An alternative you may consider is, instead of finding a reliable clock to compute the date/time difference, to check whether the user has changed the system clock. A simple way would be to store the timestamp and check periodically (every second or so) whether the new timestamp you get from the system clock is smaller than (i.e. before) the previous one. If so, you can take action.
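A hedged sketch of that periodic check (the one-second period and all names here are illustrative):

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;

ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
AtomicLong lastSeen = new AtomicLong(System.currentTimeMillis());
scheduler.scheduleAtFixedRate(() -> {
    long now = System.currentTimeMillis();
    if (now < lastSeen.get()) {
        // The system clock jumped backwards: the user likely changed the date
    }
    lastSeen.set(now);
}, 1, 1, TimeUnit.SECONDS);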
If my application triggers an event at 9pm EST, it should also trigger this event at 6pm PST.
Currently I am parsing a feed, and this feed says that the event will run at 9pm EST. Without altering the feed, what is the best way to make my code universal for anyone who opens the application in any timezone?
You always store dates as UTC time (with the time zone stored separately). Then all alarm devices can easily convert that into their own local time and raise the alarm at the appropriate moment.
Alarms that should fire at 9am no matter the time zone are then stored without a time zone.
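A small sketch of that conversion using the question's 9pm EST feed value (the format pattern and sample string are assumptions):

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public static void main(String[] args) throws Exception {
    // Parse the feed's "9 PM EST" into an absolute instant...
    SimpleDateFormat parser = new SimpleDateFormat("h a z");
    Date eventInstant = parser.parse("9 PM EST");

    // ...then render that same instant in the device's own zone.
    SimpleDateFormat local = new SimpleDateFormat("h a z");
    local.setTimeZone(TimeZone.getDefault());
    System.out.println(local.format(eventInstant)); // "6 PM PST" on a Pacific device
}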
You can get the device's current time zone like so:
TimeZone.getDefault()
...and then convert the current time of the device and the time of the feed to GMT.
Now you must test if they are equal.
Use TimeToNow via PrettyTime (http://www.ocpsoft.org/prettytime/).
It will give you results such as "4 hours ago" or "moments ago", which removes the time zone factor.
I would like to set an arbitrary time in my application. The time is downloaded from the server in milliseconds format; it should be independent of the locale and other system preferences.
But the application requires a thread-safe solution, unlike the standard, non-thread-safe Calendar object.
What is the best way?
Today I use:
Calendar calendar = Calendar.getInstance(TimeZone.getTimeZone("UTC"));
calendar.setTimeInMillis(serverTime);
But this is not a good way for me, because it isn't thread-safe.
tl;dr
The program has to contain its own internal clock, fetching the time from an external server. The clock must be thread-safe.
The time in milliseconds that a Java application uses is
the specified number of milliseconds since the standard base time
known as "the epoch", namely January 1, 1970, 00:00:00 GMT.
This number is based on the GMT time zone. If you need to print it in another time zone you can use any formatting class you want, say SimpleDateFormat. If you need to make the variable that holds it thread-safe, just synchronize on it, possibly by wrapping it in a class.
public class TimeInMillis {
    private volatile long time; // volatile gives safe publication across threads

    public void setTime(long time) { this.time = time; }

    public long getTime() { return time; }
}
Whenever you need to display it, just get the TimeInMillis object, get the time, and create a Calendar object with it. Then use a formatting class to print the time in the format, locale, and timezone you require.
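A short usage sketch of that idea (the sample value and output format are illustrative):

import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.TimeZone;

TimeInMillis serverClock = new TimeInMillis();
serverClock.setTime(1328860926471L); // millis received from the server

Calendar calendar = Calendar.getInstance(TimeZone.getTimeZone("UTC"));
calendar.setTimeInMillis(serverClock.getTime());

SimpleDateFormat format = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss z");
format.setTimeZone(calendar.getTimeZone());
System.out.println(format.format(calendar.getTime()));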
The time is downloaded from the server in milliseconds format; it should be independent of the locale and other system preferences
That isn't "time". That is a timestamp, meaning a particular time value reported by a particular piece of software at a particular point in time.
Now, if you are trying to say that, in future communications with this server, you want to translate time as reported by the device into the timebase as known by the server, that makes at least a bit of sense. In that case, you compute the delta between the device time when you receive the timestamp and the time value in the timestamp itself. Then, you apply that delta to future times you report back to the server.
The program has to contain its own internal clock, fetching the time from an external server
That makes no sense whatsoever.
In this universe, based on our current knowledge of physics, time is continuous and linear. Time does not change only when you are "fetching time from an external server". Again, what you are "fetching... from an external server" is a timestamp, a statement of what the clock on the server thought the time was at the time you made the request. You can use that timestamp for comparison purposes with other timestamps from that server, and you can use that timestamp to compare to the device's current time to determine the difference.
However, you cannot create hardware in Java code, and so you cannot create an "internal clock" in Java code. The only clock is the device's clock. As noted earlier, you can translate the device's clock's time to the timebase of the server by applying the aforementioned difference, but the passage of time is still being marked by the device's own clock.
Since the difference is going to be an int or long, you can use AtomicInteger or AtomicLong if you are concerned about multiple threads working with the value at once.
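A sketch of the delta approach described above (the class and method names are mine, not from the original answer):

import java.util.concurrent.atomic.AtomicLong;

public class ServerTimebase {
    private final AtomicLong deltaMillis = new AtomicLong(); // server minus device

    // Call once when a server timestamp arrives
    public void onServerTimestamp(long serverMillis) {
        deltaMillis.set(serverMillis - System.currentTimeMillis());
    }

    // Device time translated into the server's timebase
    public long serverNowMillis() {
        return System.currentTimeMillis() + deltaMillis.get();
    }
}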
I would just use the time in milliseconds with GMT (BTW, computers don't support UTC as such):
long serverTime = System.currentTimeMillis(); // millis since 1/1/1970 GMT.
To get/set this in a thread-safe manner you can make it volatile.
BTW, Calendar is pretty slow, even on a PC. I would avoid using it on a phone.
In my application I have a Server and x Clients. When a Client starts, it obtains the current system time from the Server. Every Client has to work with the Server time and can't use its own system time.
Now my question: what is the best way to run a clock on the Client that starts with the current Server time and runs nearly synchronously with it, without having to receive the Server time every x seconds?
The goal is to display a running clock with the Server time on the Client.
The tolerance that the client clock may have is about 1 second in 24 hours.
In my solution I have a Timer that triggers every 500 ms and adds 500 ms to the Server time each time it executes. But this is not a good solution :) because the Client clock drifts from the Server time.
Thanks for your reply
You should almost certainly use an established clock synchronization method such as the Network Time Protocol rather than building your own custom solution. It will provide you with better results than you will make yourself, and you have the added benefit that all your servers agree about what time it is :-)
I found a solution that fits my situation perfectly.
Instead of using System.currentTimeMillis(), I'm using System.nanoTime().
System.nanoTime() is independent of the system clock.
When I receive the current Server time, I additionally save the System's nanosecond counter. The current Server time can then be calculated as the received Server time plus the difference between the current nanoTime and the nanoTime saved when the Server time was received.
Example:
// The Client starts and receives the current Server time and the nanoTime
private long serverTime = server.getCurrentTime();
private long systemNano = System.nanoTime();

// To calculate the current Server time without another service call (ns -> ms):
long currentServerTime = serverTime + ((System.nanoTime() - systemNano) / 1000000);
Thx
One way to do this is to get the difference between the server time and the local time, and use that for time calculations.
The client starts and gets the time from the server (I am going to assume the current time in milliseconds is used, but this can be adapted to whatever you are using).
The client checks its current system time and saves the difference.
The difference can then always be applied to the current system time on the client to calculate the server time.
Example:
long serverTime = 1328860926471L; // 2012/02/10 10:02, received from wherever
long currentTime = System.currentTimeMillis(); // current client time
long difference = currentTime - serverTime;

// Server time can then be retrieved like this:
long currentServerTime = System.currentTimeMillis() - difference;
Date serverTimeDate = new Date(currentServerTime);
Obviously the difference must be saved the moment the server time is received.