zishu's blog

A blogger who loves life. https://zishu.me

Why does getTime() return milliseconds from 1970 to now?

Today, while writing new Date(), I stumbled on an interesting method, getTime(). I searched on Baidu and found that it returns the number of milliseconds from January 1, 1970, to the present.

Why is it 1970?

new Date().getTime();

// xxxxxxxxxxx (the number of milliseconds elapsed since 1970-01-01 00:00:00 UTC)

This goes back to the birth of Unix. Unix was developed in 1969 and officially released in 1971; before then, no machine ever needed to represent a time earlier than 1970-01-01 00:00:00. Many later languages adopted this convention, and JavaScript simply followed it.

Of course, this convention has its problems today. For example, it is awkward for representing times before the epoch, and the precision is limited.
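For instance, here is a minimal Java sketch (the variable name is mine) showing that an instant before the epoch is simply a negative millisecond count:

import java.util.Date;

// One day before the epoch: a negative number of milliseconds since 1970-01-01 00:00:00 UTC
Date beforeEpoch = new Date(-24L * 60 * 60 * 1000);

System.out.println(beforeEpoch.getTime()); // -86400000
System.out.println(beforeEpoch); // Wed Dec 31 ... 1969, rendered in the local time zone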

Speaking of time being defined as starting from January 1, 1970, I suddenly remembered that in Java, and in Oracle databases, time is also counted from January 1, 1970.

For example, in Java:

import java.util.Date;

Date date = new Date(0); // 0 milliseconds after the epoch

System.out.println(date);

// The printed result: Thu Jan 01 08:00:00 CST 1970

The date is January 1, 1970, and the underlying instant is 0:00:00 (it prints as 8:00, which will be explained later).

Why was this time defined as January 1, 1970?

So I started Googling, but I couldn't find the answer on Chinese web pages, so I tried searching with English keywords and finally found a relevant post on the Sun Java forum:

http://forums.sun.com/thread.jspa?threadID=595140&start=15

There is a reply:

I suspect that Java was born and raised on a UNIX system.
UNIX considers the epoch (when did time begin) to be midnight, January 1, 1970.

In other words, Java originated on the UNIX system, and UNIX regards 0:00 on January 1, 1970, as the epoch, i.e. the point at which time begins.

But this still doesn't explain "why." Out of curiosity, I continued to Google and finally found the answer:

http://en.wikipedia.org/wiki/Unix_time

The explanation here is:

Originally, computer operating systems were 32-bit, and time was also stored as a 32-bit signed integer counting seconds.

System.out.println(Integer.MAX_VALUE);

// 2147483647

In Java, an Integer is 32 bits, so the largest value a signed 32-bit integer can hold is 2147483647. A year of 365 days has 31536000 seconds, and 2147483647 / 31536000 ≈ 68.1, so a signed 32-bit seconds counter can span at most about 68 years. Concretely, the counter reaches its maximum at 03:14:07 on January 19, 2038. One second later, the value on 32-bit systems wraps around to 10000000 00000000 00000000 00000000 in binary (i.e. -2147483648), which corresponds to 20:45:52 on December 13, 1901. Time jumps backwards, and a great deal of software will misbehave.
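As a quick check, here is a small sketch using java.time (available since Java 8; Instant prints in UTC), showing the last second a signed 32-bit counter can hold and the instant it wraps around to:

import java.time.Instant;

// The largest and smallest values of a signed 32-bit seconds counter, shown as UTC instants
System.out.println(Instant.ofEpochSecond(Integer.MAX_VALUE)); // 2038-01-19T03:14:07Z
System.out.println(Instant.ofEpochSecond(Integer.MIN_VALUE)); // 1901-12-13T20:45:52Z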

So far, I think the answer to the question has emerged:

Because a 32-bit seconds counter can only span about 68 years, the designers of the earliest UNIX systems, considering when computers came into being and how long applications would plausibly need to run, chose January 1, 1970 as the epoch (starting point) of UNIX time, and Java naturally inherited this convention.

As for the time-regression problem, I believe it will gradually be solved as 64-bit operating systems become common, because a 64-bit counter can represent time up to 15:30:08 on December 4 of the year 292,277,026,596. Even our distant descendants, right up to the day the Earth is destroyed, will not have to worry about it running out, because that date is hundreds of billions of years in the future.
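A rough back-of-the-envelope check (a sketch; 31556952 is the average number of seconds in a Gregorian year, my own choice of constant):

// How many years a signed 64-bit seconds counter can cover after 1970
long secondsPerYear = 31556952L; // average Gregorian year
System.out.println(Long.MAX_VALUE / secondsPerYear); // roughly 292 billion years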

The last question:

In the System.out.println(new Date(0)) call above, the printed time is 8:00 rather than 0:00. The reason is the difference between system (UTC) time and local time: the underlying instant really is 0:00 UTC, but my computer's time zone is set to GMT+8, so the printed result is 8:00.
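Here is a minimal sketch of the same point, formatting the identical instant (epoch 0) in two time zones; the format pattern is my own choice:

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

Date epoch = new Date(0);
SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");

fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
System.out.println(fmt.format(epoch)); // 1970-01-01 00:00:00

fmt.setTimeZone(TimeZone.getTimeZone("GMT+8"));
System.out.println(fmt.format(epoch)); // 1970-01-01 08:00:00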
