Reputation: 17
I am working on a simple game (just for fun). I have a server in C and a client in Java.
I want to get the current time on both the server and the client, but I cannot get them to produce the same result.
On the server I am using:
// the current system time (UTC)
SYSTEMTIME systemTime;
GetSystemTime(&systemTime);

// convert it to a FILETIME (100-nanosecond intervals since 1601-01-01)
FILETIME fileTime;
SystemTimeToFileTime(&systemTime, &fileTime);

// pack the two 32-bit halves into one 64-bit value
ULONGLONG fileTimeNano100;
fileTimeNano100 = (((ULONGLONG) fileTime.dwHighDateTime) << 32) + fileTime.dwLowDateTime;

// convert to milliseconds and subtract the Windows-to-Unix epoch offset
ULONGLONG posixTime = fileTimeNano100 / 10000 - 11644473600000;
return posixTime;
And I am getting output in this format: 1750721123
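For completeness, here is a minimal self-contained sketch of the same conversion that can be compiled and run on its own (the unixTimeMillis helper name and the main wrapper are just for illustration, not part of the actual server; it assumes a Windows build with <windows.h>):

#include <stdio.h>
#include <windows.h>

// current UTC time as milliseconds since the Unix epoch (1970-01-01)
static ULONGLONG unixTimeMillis(void) {
    SYSTEMTIME systemTime;
    GetSystemTime(&systemTime);

    FILETIME fileTime;
    SystemTimeToFileTime(&systemTime, &fileTime);

    // 100-nanosecond intervals since 1601-01-01, as one 64-bit value
    ULONGLONG fileTimeNano100 =
        (((ULONGLONG) fileTime.dwHighDateTime) << 32) + fileTime.dwLowDateTime;

    // to milliseconds, then shift from the Windows epoch to the Unix epoch
    return fileTimeNano100 / 10000 - 11644473600000ULL;
}

int main(void) {
    // %llu keeps the full 64-bit value; a 32-bit format or return type would truncate it
    printf("%llu\n", unixTimeMillis());
    return 0;
}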
On the client I am using:
long lDateTime = new Date().getTime();
System.out.println("Date() - Time in milliseconds: " + lDateTime);

Calendar lCDateTime = Calendar.getInstance();
System.out.println("Calendar - Time in milliseconds: " + lCDateTime.getTimeInMillis());
And the output I am getting:
Calendar - Time in milliseconds: 1419089968022
Date() - Time in milliseconds: 1419089968022
Why? Where is the problem? How can I get the same time?
Both programs run on the same PC (Windows 8.1).
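For reference, both new Date().getTime() and Calendar.getInstance().getTimeInMillis() return milliseconds since the Unix epoch (UTC), the same value as System.currentTimeMillis(); here is a minimal sketch to confirm this (the class name TimeCheck is just for illustration):

import java.util.Calendar;
import java.util.Date;

public class TimeCheck {
    public static void main(String[] args) {
        // all three report milliseconds since 1970-01-01T00:00:00 UTC
        System.out.println(System.currentTimeMillis());
        System.out.println(new Date().getTime());
        System.out.println(Calendar.getInstance().getTimeInMillis());
    }
}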
Upvotes: 1
Views: 166
Reputation: 459
First of all, the first block of code is not standard C at all, or rather it uses libraries that I simply don't know.
There is no simple way to get the current time with sub-second accuracy in standard C, but here is an example in Java and C that actually works, so I hope this helps.
Java
package stackOverflow;

import java.util.Date;

public class Main {
    public static void main(String[] args) {
        long lDateTime = new Date().getTime();
        System.out.println(lDateTime / 1000);
    }
}
Output: 1436200408
C
#include <stdio.h>
#include <stdlib.h>
#include <sys/time.h>
#include <time.h>

int main(void) {
    struct timeval tv;
    gettimeofday(&tv, NULL);

    // whole seconds since the Unix epoch
    printf("%ld\n", tv.tv_sec);
    return 0;
}
Output: 1436200418
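If millisecond resolution is needed so the value matches Java's getTime() exactly, the same gettimeofday() result can be combined into milliseconds; a minimal sketch, assuming a POSIX environment (gettimeofday() is POSIX rather than standard C):

#include <stdio.h>
#include <sys/time.h>

int main(void) {
    struct timeval tv;
    gettimeofday(&tv, NULL);

    // seconds and microseconds combined into milliseconds since the Unix epoch
    long long millis = (long long) tv.tv_sec * 1000 + tv.tv_usec / 1000;
    printf("%lld\n", millis);
    return 0;
}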
Upvotes: 1