Unix Time Conversion Between Languages

I have this unix timestamp in milliseconds: -62135769600000

I tried to convert the timestamp using Java and JavaScript, but I got two different results. Look below:

// Java
Date d = new Date(-62135769600000L);
System.out.println(d); // output = Sat Jan 01 00:00:00 GMT 1

// JavaScript (note: no L suffix; that would be a syntax error in JS)
const d = new Date(-62135769600000);
console.log(d); // output = Fri Dec 29 0000 19:00:00 GMT-0500 (EST)

As you can see, we have two different results. I want to understand why and if possible how to fix that.

2 thoughts on “Unix Time Conversion Between Languages”

  1. That is because Java’s Date and JavaScript’s Date have two very different implementations.

    There are actually two differences here: the format of the output text and the actual date represented by the output.

    The different date

    I’m still investigating why Java’s Date would return 1 January 0001. I do know that JavaScript returns the correct date value. Java’s Date is broken; don’t use it. Instead, you should use the modern Java Date and Time API, available in the java.time package.

    The following Java code returns the correct date:

    OffsetDateTime odt = Instant.ofEpochMilli(-62135769600000L)
            .atOffset(ZoneOffset.UTC);
    System.out.println(odt); // 0000-12-30T00:00Z

    The different formatting

    Java’s Date and JavaScript’s Date each implement toString() in their own way. toString() always returns a textual representation of the object on which the method is called, but what that text looks like depends entirely on what the author had in mind.

    You should never rely on toString() returning a particular format. If you do want your output in a specific format, use DateTimeFormatter in Java.
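    For instance, here is a minimal sketch of pinning the format down with DateTimeFormatter (the pattern and the UTC zone are illustrative choices, not something the question requires):

```java
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;
import java.util.Locale;

public class FormatDemo {
    public static void main(String[] args) {
        Instant i = Instant.ofEpochMilli(-62135769600000L);
        // An explicit pattern guarantees the output text, unlike toString():
        DateTimeFormatter fmt = DateTimeFormatter
                .ofPattern("uuuu-MM-dd HH:mm:ss", Locale.ROOT)
                .withZone(ZoneOffset.UTC);
        System.out.println(fmt.format(i)); // 0000-12-30 00:00:00
    }
}
```

    With a fixed pattern, locale, and zone, every JVM prints the same text, which is exactly what toString() does not promise.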

  2. First, a lesson: java.util.Date is dumb, broken, and should never be used. The right way to do time stuff in java is with the java.time package.

    Then, to explain your observed differences:

    • javascript is reporting the time by translating the moment in time using the gregorian calendar, and as if you were in the EST timezone.
    • java’s Date is reporting the time as if it were the julian calendar, and in the GMT timezone.

    The timezone difference explains why javascript prints 19:00 and java prints 00:00, as well as one day’s worth of difference: when you clap your hands at 19:00 in New York on dec 29th, at that very instant in time, it is exactly midnight, dec 30th, in london. The remaining exactly 48 hours (2 days) of difference is because of the julian calendar.
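    To see just the timezone half of that in isolation, here is a small java.time sketch (the 2020 date is only an example; any modern date behaves the same way):

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class SameInstantTwoZones {
    public static void main(String[] args) {
        // One instant, viewed from two zones: 19:00 in New York on dec 29th
        // is already 00:00 of dec 30th in London.
        Instant clap = ZonedDateTime.of(2020, 12, 29, 19, 0, 0, 0,
                ZoneId.of("America/New_York")).toInstant();
        System.out.println(clap.atZone(ZoneId.of("America/New_York")));
        // 2020-12-29T19:00-05:00[America/New_York]
        System.out.println(clap.atZone(ZoneId.of("Europe/London")));
        // 2020-12-30T00:00Z[Europe/London]
    }
}
```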

    The right way, in java:

    Instant i = Instant.ofEpochMilli(-62135769600000L);
    System.out.println(i.atOffset(ZoneOffset.UTC));
    > 0000-12-30T00:00Z

    The calendar system commonly used today is called the gregorian calendar. It is a modification of what was used before, the julian calendar, which was in use in roman times.

    The world did not switch from julian to gregorian on the same day. In fact, russia was very late and switched only in 1918: you bought a newspaper in moscow and it read ‘January 31st, 1918’ (but in russian). If you then hopped on a horse and rode west overnight until you hit prague or berlin or amsterdam, and asked: hey, what date is today? They’d tell you: it’s February 13th.

    Then you ride back to moscow the next day and buy another newspaper. It reads ‘February 14th, 1918’. What happened to february 1st–13th? They never existed. That was when russia switched.

    FUN FACT: The ‘october revolution’, when the bolsheviks seized power, happened in november. Russia, still on the julian calendar, was in october when that happened. The rest of the world, already on the gregorian, called it november.

    Even though this is a dumb thing to do, java.util.Date tries to surmise what you really wanted and has an arbitrary cutover timestamp (October 15th, 1582, the default gregorian changeover): when you make a Date object that predates this, it renders times in the julian calendar.

    This is silly (the world didn’t switch all in one go), so neither javascript nor java.time do this.

    That explains the 2 days.
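    Putting the two behaviours side by side on the timestamp from the question (the Date output shown assumes a GMT default time zone, since Date.toString() uses whatever the JVM default happens to be):

```java
import java.time.Instant;
import java.util.Date;

public class CutoverDemo {
    public static void main(String[] args) {
        long millis = -62135769600000L;
        // java.util.Date switches to the julian calendar for dates before
        // its 1582 cutover; with a GMT default zone it prints Jan 01, year 1:
        System.out.println(new Date(millis));
        // java.time uses the proleptic gregorian calendar for all dates:
        System.out.println(Instant.ofEpochMilli(millis)); // 0000-12-30T00:00:00Z
    }
}
```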
