Anagha

Reputation: 3699

What is the difference between UTC and GMT?

I have a few queries regarding the Time zones:

  1. Can the time be captured in UTC alone?
  2. Is UTC -6 and GMT -6 the same, and does that mean it is US local time?
  3. Say, I have UTC time as "02-01-2018 00:03" does that mean my US local time is "01-01-2018 18:00"?

I have searched on Wikipedia and many related websites but haven't found a relevant explanation.

Upvotes: 246

Views: 279846

Answers (5)

Basil Bourque

Reputation: 339855

❌ The accepted Answer is neither correct nor useful.

✅ In contrast, the Answer by Anonymous correctly summarizes the technical differences — for details follow the links to detailed pages in Wikipedia.

For programmers building business-oriented apps, the upshot is that UTC is the new GMT. You can use the terms interchangeably, with the difference being literally less than a second. So for all practical purposes in most apps, no difference at all.

Here is some more practical advice, with code examples.

Strings

Say, I have UTC time as "02-01-2018 00:03" does that mean my US local time is "01-01-2018 18:00"?

That first part is a bad example, with the date-time string lacking an indicator of its offset or zone.

If a string indicates a specific moment, it must indicate a time zone (Continent/Region formatted name) and/or an offset-from-UTC as a number of hours-minutes-seconds. If the string is meant to represent a moment in UTC itself, that means an offset-from-UTC of zero.

To write that string with an offset, various conventions may be applied. The best practice is to include both hours and minutes along with a colon, such as +00:00, +05:30, or -08:00. The leading zero and the colon are both optional, but I have seen libraries break when encountering a value such as -0800 or -8.
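
For example, in java.time (the library used later in this Answer), such offset strings parse directly into ZoneOffset objects. A minimal sketch:

ZoneOffset utc = ZoneOffset.of( "+00:00" ) ;       // Zero hours-minutes-seconds, equivalent to the constant `ZoneOffset.UTC`.
ZoneOffset india = ZoneOffset.of( "+05:30" ) ;     // Five and a half hours ahead of UTC.
ZoneOffset pacific = ZoneOffset.of( "-08:00" ) ;   // Eight hours behind UTC.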

Zulu

As a shortcut for an offset of zero, the letter Z is commonly used to mean UTC itself. Pronounced Zulu.

ISO 8601

Furthermore, best practice in formatting date-time values textually for computing is to use the ISO 8601 standard formats. For a date-time, the format YYYY-MM-DDTHH:MM:SS±HH:MM is used. The T separates the date portion from the time-of-day portion. This format has advantages such as being largely unambiguous, easy to parse by machine, and easy to read by humans across cultures. Another advantage is that alphabetical sorting is also chronological sorting. The standard accepts the Z abbreviation as well.

So your example UTC time as "02-01-2018 00:03" is better stated as 2018-01-02T00:03Z.
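
The java.time classes discussed below use the ISO 8601 formats by default when parsing and generating text. A minimal sketch:

OffsetDateTime odt = OffsetDateTime.parse( "2018-01-02T00:03Z" ) ;  // The java.time classes parse/generate ISO 8601 formats by default.
String output = odt.toString() ;                                    // Yields: 2018-01-02T00:03Z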

java.time

Be very aware that most programming languages, libraries, and databases have very poor support for date-time handling, usually based on a poor understanding of date-time issues. Handling date-time is surprisingly complicated and tricky to master.

The only decent library I have encountered is the java.time classes (see Tutorial) bundled with Java 8 and later, and its predecessor the Joda-Time project (also loosely ported from Java to .Net in the Noda Time project).

In java.time, a moment is represented in three ways, as shown in the sketch after this list. All have a resolution of nanoseconds.

  • Instant
    Always in UTC. Technically, a count of nanoseconds since the epoch reference of the first moment of 1970 in UTC (1970-01-01T00:00:00Z).
  • OffsetDateTime
    A date with time-of-day in the context of a certain number of hours-minutes-seconds ahead of, or behind, UTC.
  • ZonedDateTime
    A date with time-of-day in the context of a certain time zone.
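
Here is the sketch mentioned above, representing a single moment all three ways. The particular offset (+05:30) and zone (Asia/Kolkata) are arbitrary choices for illustration.

Instant instant = Instant.parse( "2018-01-02T00:03:00Z" ) ;                     // A moment in UTC.
OffsetDateTime odt = instant.atOffset( ZoneOffset.ofHoursMinutes( 5 , 30 ) ) ;  // Same moment, seen through an offset of five and a half hours ahead of UTC.
ZonedDateTime zdt = instant.atZone( ZoneId.of( "Asia/Kolkata" ) ) ;             // Same moment, seen through the wall-clock time of a particular region.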

So what is the difference between a time zone and an offset-from-UTC? Why do we need separate classes? An offset-from-UTC is simply a number of hours-minutes-seconds: three numbers, no more, no less. A time zone is much more. A time zone is a history of the past, present, and future changes to the offset used by the people of a particular region.

What changes? Changes dictated by the whims or wisdom of their politicians. Politicians around the world have shown a predilection for changing the offset used by the time zone(s) in their jurisdiction. Daylight Saving Time (DST) is one common pattern of changes, with its schedule often altered and the decision to enact or revert from DST sometimes reversed. Other changes happen too. Just in the last few years, North Korea changed its clock by half an hour to sync with South Korea, Venezuela turned its clock back half an hour only to jump forward again less than a decade later, Turkey this year canceled the scheduled change from DST to standard time with little forewarning, and Russia has made multiple such changes in recent years.
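
You can see that history in code. A rough sketch, asking the America/Chicago zone for the offset it used at two different moments (one in winter, one during DST):

ZoneRules rules = ZoneId.of( "America/Chicago" ).getRules() ;                      // The history of offset changes for that region.
ZoneOffset winter = rules.getOffset( Instant.parse( "2018-01-02T00:00:00Z" ) ) ;   // -06:00, standard time.
ZoneOffset summer = rules.getOffset( Instant.parse( "2018-07-02T00:00:00Z" ) ) ;   // -05:00, Daylight Saving Time.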

Back to your example in your point # 3, let's look at some code.

Say, I have UTC time as "02-01-2018 00:03" does that mean my US local time is "01-01-2018 18:00"?

Your example strings have another problem. The 03 minute in the first part is dropped in your second part, an apparent typo. I know it is a typo because there was no time zone adjustment in effect in the Americas on that date involving a fractional-hour offset of 57 minutes.

Not a moment

First, we parse your input string. Lacking any indicator of zone or offset, we must parse it as a LocalDateTime. The name LocalDateTime may be misleading, as it does not mean a specific locality. It means any or all localities. For more explanation, see What's the difference between Instant and LocalDateTime?.

String input = "2018-01-02T00:03" ;                  // Text of a date with time-of-day but without any context of time zone or offset-from-UTC. *Not* a moment, *not* a point on the timeline.
LocalDateTime ldt = LocalDateTime.parse( input ) ;   // Parsing the input as a `LocalDateTime`, a class representing a date with time but no zone/offset. Again, this does *not* represent a moment, is *not* a point on the timeline. 

UTC

By the facts given in the Question, we know this date and time was intended to represent a moment in UTC. So we can assign the context of an offset-from-UTC of zero hours-minutes-seconds, for UTC itself. We apply the ZoneOffset constant UTC to get an OffsetDateTime object.

OffsetDateTime odt = ldt.atOffset( ZoneOffset.UTC );    // We are certain this text was intended to represent a moment in UTC. So correct the faulty text input by assigning the context of an offset of zero, for UTC itself.

Time zone

The Question asks to see this moment through a wall-clock time used in the United States that is six hours behind UTC. One time zone with such an offset is America/Chicago.

Specify a proper time zone name in the format of continent/region, such as America/Montreal, Africa/Casablanca, or Pacific/Auckland. Never use the 2-4 letter abbreviation such as CST, EST, or IST as they are not true time zones, not standardized, and not even unique(!).

ZoneId z = ZoneId.of( "America/Chicago" ) ; // Adjust from UTC to a time zone where the wall-clock time is six hours behind UTC.
ZonedDateTime zdt = odt.atZoneSameInstant( z ) ;

See this code run live at IdeOne.com.

odt.toString(): 2018-01-02T00:03Z

zdt.toString(): 2018-01-01T18:03-06:00[America/Chicago]

Same moment, different wall-clock time

The odt and zdt objects both represent the same simultaneous moment, the same point on the timeline. The only difference is the wall-clock time.

Let's work an example, using Iceland, where the time zone uses an offset-from-UTC of zero hours-minutes-seconds. So the zone Atlantic/Reykjavik has a wall-clock time identical to UTC. At least currently their wall-clock time matches UTC; in the past or future it may be different, which is why it is incorrect to say “UTC is the time zone of Iceland”. Anyway, on to our example… say someone in Reykjavík, Iceland, with 3 minutes after midnight on the clock hanging on their wall, makes a phone call to someone in the US. That US person lives in a place using the Chicago region time zone. As the person called picks up their phone, they glance up at the clock hanging on their wall to see the time is just after 6 PM (18:03). Same moment, different wall-clock time.

Also, the calendars hanging on their walls are different, as it is “tomorrow” in Iceland but “yesterday” in mainland US. Same moment, different dates!
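
A minimal sketch of that phone-call scenario, using the Atlantic/Reykjavik and America/Chicago zones:

Instant moment = Instant.parse( "2018-01-02T00:03:00Z" ) ;                      // One single moment.
ZonedDateTime reykjavik = moment.atZone( ZoneId.of( "Atlantic/Reykjavik" ) ) ;  // 2018-01-02T00:03Z[Atlantic/Reykjavik], a few minutes past midnight, “tomorrow”.
ZonedDateTime chicago = moment.atZone( ZoneId.of( "America/Chicago" ) ) ;       // 2018-01-01T18:03-06:00[America/Chicago], just after 6 PM, “yesterday”.
boolean sameMoment = reykjavik.toInstant().equals( chicago.toInstant() ) ;      // true, because both refer to the same point on the timeline.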


Table of date-time types in Java, both modern and legacy.


About java.time

The java.time framework is built into Java 8 and later. These classes supplant the troublesome old legacy date-time classes such as java.util.Date, Calendar, & SimpleDateFormat.
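
If you must interoperate with old code not yet updated to java.time, you can convert back and forth. A minimal sketch using the conversion methods added to the legacy classes in Java 8:

java.util.Date legacyDate = new java.util.Date() ;            // The troublesome legacy class, representing a moment in UTC with millisecond resolution.
Instant instant = legacyDate.toInstant() ;                    // Convert legacy -> modern.
java.util.Date backAgain = java.util.Date.from( instant ) ;   // Convert modern -> legacy, for old code not yet updated.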

The Joda-Time project, now in maintenance mode, advises migration to the java.time classes.

To learn more, see the Oracle Tutorial. And search Stack Overflow for many examples and explanations. Specification is JSR 310.

You may exchange java.time objects directly with your database. Use a JDBC driver compliant with JDBC 4.2 or later. No need for strings, no need for java.sql.* classes.
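
For example, with a JDBC 4.2 compliant driver, something like this sketch should work (the myPreparedStatement and myResultSet variables here are hypothetical placeholders):

OffsetDateTime odt = OffsetDateTime.now( ZoneOffset.UTC ) ;                     // Capture the current moment as seen in UTC.
myPreparedStatement.setObject( 1 , odt ) ;                                      // Write to a column of a type akin to the SQL-standard TIMESTAMP WITH TIME ZONE.
OffsetDateTime retrieved = myResultSet.getObject( 1 , OffsetDateTime.class ) ;  // Retrieve, with no need for strings or java.sql classes.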

Where to obtain the java.time classes?

Table of which java.time library to use with which version of Java or Android

The ThreeTen-Extra project extends java.time with additional classes. This project is a proving ground for possible future additions to java.time. You may find some useful classes here such as Interval, YearWeek, YearQuarter, and more.
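
For instance, a rough sketch assuming the org.threeten.extra classes are on your classpath:

Interval interval = Interval.of( Instant.parse( "2018-01-01T00:00:00Z" ) , Instant.parse( "2018-01-02T00:00:00Z" ) ) ;  // A span of time attached to the timeline.
boolean covered = interval.contains( Instant.parse( "2018-01-01T12:00:00Z" ) ) ;  // true
YearQuarter quarter = YearQuarter.of( 2018 , 1 ) ;                                // The first quarter of 2018.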

Upvotes: 180

aderchox

Reputation: 4074

Although this question is not a coding question, programmers bump into it pretty often, so a non-programming, easy-to-understand answer would be worthwhile as well.

In simple terms, we can think of both GMT and UTC as "clocks" that tell us "the time"; however, UTC is the more precise one, so it took GMT's place in how we measure time around the world.

But wait, what does "a more precise clock" even mean? To understand this, we need to understand what "the time" itself is in the first place!

Time (in its present form, i.e., months, days, hours, minutes, etc.) is a human-made thing, and is our "means of measuring the passage of events". Say you're planning a programming event next month. What is "next month"? You need a shared and widespread understanding of the time (the amount of the passage of events) that we all consider "next month"!

Our shared understanding of "the time" today is the result of a convention/agreement our ancestors left for us: they searched for some regularly recurring event, and they initially found it in the change of day to night and in the change of seasons. (Notice that these are astronomical events, i.e., the rotation of the Earth around its own axis and the Sun's position in the sky. This is what the GMT clock (time zone) uses to measure the time.) But there was still room for improvement:

The measurement of time began with the invention of sundials in ancient Egypt some time prior to 1500 B.C. However, the time the Egyptians measured was not the same as the time today's clocks measure. For the Egyptians, and indeed for a further three millennia, the basic unit of time was the period of daylight. source

So basically, you could tell someone "I'll see you tomorrow", but you couldn't tell them "I'll see you tomorrow at 7:30"! :D

So the next [obvious] step for them was to make their measurement of the time more precise, and as a result they agreed (not surprisingly) to divide each full day into 24 parts, aka "hours", to divide each hour into 60 minutes, and so on. So today you have something called "a clock" that allows you to universally keep your understanding of every past, current, and future event in sync with others, and therefore you can happily plan your event next month without an issue (well, without a "temporal issue" at least!).

Why was UTC needed, then? UTC was needed for the exact same reason; it was a further step on this interminable road driven by our unquenchable thirst for more and more precision. As said above, GMT used some not-stable-enough astronomical events to tell us the time, i.e., events not as precisely regular as we expect from the basis of our modern agreement for measuring the time.

Don't get me wrong: the GMT clock is actually still good enough for numerous purposes, but it's just like a ruler that measures in units only as small as millimeters, while we definitely need greater precision once we get into realms like engineering, science, and so on.

Here's where UTC comes in wonderfully. UTC measures the "frequency of an atom's vibration" (*), and that happens to be a much more stable and regularly recurring event compared to things like the Earth's rotation or the Sun's position in the sky. Indeed, we're talking about deviations on the order of a billionth of a second per day versus milliseconds per day! So no wonder UTC can keep us in sync in a much more reliable way.

That's it, ...for the most part! If you're a curious mind, there's still one more delicate (and important) point we've neglected to mention here. If you think about it, UTC is great in terms of its precise measuring of the time; however, it also creates a (minor) issue: we actually do care about what a full day is! This is regardless of how we measure the time! No matter how accurate a clock is, no matter how precisely it keeps us in sync with each other, we do not like it to make our days dark and our nights bright! (with a bit of exaggeration). So we need to occasionally adjust the UTC clock, and this is exactly what we do, by occasionally inserting an extra second, aka a "leap second". From Wikipedia:

A leap second is a one-second adjustment that is occasionally applied to Coordinated Universal Time (UTC), to accommodate the difference between precise time (International Atomic Time (TAI), as measured by atomic clocks) and imprecise observed solar time (UT1), which varies due to irregularities and long-term slowdown in the Earth's rotation.


(*) Disclaimer: I'm neither a physicist nor an astronomer, and these topics are very complicated, so this answer will naturally have some inaccuracies. If you need a deeper and more accurate understanding, read further in this answer on Physics Stack Exchange.

Upvotes: 1

Abdullah Ahmed Ghaznavi

Reputation: 2099

There is no time difference between Coordinated Universal Time and Greenwich Mean Time.

7:17 AM Friday, Coordinated Universal Time (UTC) is
7:17 AM Friday, Greenwich Mean Time (GMT)

Key difference: Both UTC and GMT are time standards that differ in terms of their derivation and their use.

To quote timeanddate.com:

The Difference Between GMT and UTC:

Greenwich Mean Time (GMT) is often interchanged or confused with Coordinated Universal Time (UTC). But GMT is a time zone and UTC is a time standard.

Although GMT and UTC share the same current time in practice, there is a basic difference between the two:

  • GMT is a time zone officially used in some European and African countries. The time can be displayed using both the 24-hour format (0 - 24) or the 12-hour format (1 - 12 am/pm).
  • UTC is not a time zone, but a time standard that is the basis for civil time and time zones worldwide. This means that no country or territory officially uses UTC as a local time.

Upvotes: 67

Ben Holland

Reputation: 61

GMT is a mean solar time calculated at the Greenwich meridian. https://www.rmg.co.uk/discover/explore/greenwich-mean-time-gmt

UTC is based on the extremely regular "ticking" of caesium atomic clocks. https://en.wikipedia.org/wiki/Coordinated_Universal_Time

They are neither based on the same time nor calculated the same way. IMHO, the wording on https://currentmillis.com is misleading, at best, if not just flat out incorrect.

Upvotes: 6

Anonymous

Reputation: 86359

Astronomy versus Atomic clock

By the original definitions the difference is that GMT (also officially known as Universal Time (UT), which may be confusing) is based on astronomical observations while UTC is based on atomic clocks. Later, GMT came to be used, at least unofficially, to refer to UTC, which blurs the distinction somewhat.

GMT stands for Greenwich Mean Time, the mean solar time at the Royal Observatory in Greenwich on the south bank in Eastern London, UK. When the sun is at its highest point exactly above Greenwich, it is 12 noon GMT. Except: The Earth spins slightly unevenly, so 12 noon is defined as the annual average, mean of when the sun is at its highest, its culmination. In GMT there can never be any leap seconds because Earth’s rotation doesn’t leap.

UTC, which stands for Coordinated Universal Time in English, is defined by atomic clocks, but is otherwise the same. In UTC a second always has the same length. Leap seconds are inserted in UTC to keep UTC and GMT from drifting apart. By contrast, in GMT the seconds are stretched as necessary, so in principle they don’t always have the same length.

For roughly 100 years GMT was used as the basis for defining time around the world. Since the world these days mostly bases precise definition of time on atomic clocks, it has become customary to base the definition of time on UTC instead.

Edit: The original meaning of GMT is somewhat useless these days, but the three-letter combination doesn't seem to go away. I take it that it is often used without regard to whether UTC is really intended, so don't put too much trust in the strict definition given above.

For your questions:

  1. Yes, time can be captured in UTC alone. Storing time in UTC and using UTC for transmitting date-time information is generally considered good practice.
  2. I suppose it’s up to each state of the US to define its time. And I don’t know, but I suppose that today they (officially or in practice) define time as an offset from UTC rather than GMT. The difference between the two will always be less than a second, so for many purposes you will not need to care. Central Standard Time (for example America/Chicago) is at offset -6, as is Mountain Daylight Time (for example America/Denver). On the other hand, offset -6 doesn’t necessarily imply a time in the US. Parts of Canada and Mexico use it too, of course, and also Galapagos and Easter Island.
  3. I don’t think you got your example time exactly right, but yes, 2 January 2018 at 00:00 UTC is the same point in time as 1 January 2018 at 18:00 in Chicago and other places that are at UTC-6 in winter (winter on the Northern hemisphere, that is).

Further reading:

Upvotes: 231
