mas

Reputation: 1176

Why is DateTime based on Ticks rather than Milliseconds?

Why is the minimum resolution of a DateTime based on Ticks (100-nanosecond units) rather than on Milliseconds?

Upvotes: 23

Views: 31614

Answers (5)

InGeek

Reputation: 2682

Just for the information:

1 millisecond = 10,000 ticks

1 second = 10,000,000 ticks

Using the difference (delta) of two tick values you can get more granular precision (converting it to milliseconds or seconds later).

In a C# DateTime context, ticks start at 0 (DateTime.MinValue.Ticks) and run up to DateTime.MaxValue.Ticks:

new DateTime(0)                          // any tick value from 0 through (864*10^9 - 1) produces the same date, 01/01/0001
new DateTime(DateTime.MaxValue.Ticks)    // the maximum tick value produces 12/31/9999

The system tick count advances by 864 billion ticks per day (86,400 seconds × 10,000,000 ticks per second).
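
A minimal sketch of the delta idea, using the framework's standard TimeSpan conversion constants (Thread.Sleep here is just a stand-in for whatever is being timed, and note that the system clock's actual resolution is much coarser than one tick):

    using System;
    using System.Threading;

    class DeltaDemo
    {
        static void Main()
        {
            long t1 = DateTime.UtcNow.Ticks;
            Thread.Sleep(5);                              // stand-in for the work being timed
            long t2 = DateTime.UtcNow.Ticks;

            long delta = t2 - t1;
            // 1 ms = 10,000 ticks, so dividing as a double keeps sub-millisecond detail
            Console.WriteLine(delta / (double)TimeSpan.TicksPerMillisecond + " ms");
            Console.WriteLine(delta / (double)TimeSpan.TicksPerSecond + " s");
        }
    }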

Upvotes: 7

CodesInChaos

Reputation: 108810

  • TimeSpan and DateTime use the same tick unit, making operations like adding a TimeSpan to a DateTime trivial.
  • More precision is good. It's mainly useful for TimeSpan, but the reason above transfers it to DateTime.

    For example, Stopwatch measures short time intervals, often shorter than a millisecond, and it can return a TimeSpan.
    In one of my projects I used TimeSpan to address audio samples. 100ns is short enough for that; milliseconds wouldn't be.

  • Even with millisecond ticks you'd need an Int64 to represent DateTime. But then you'd be wasting most of the range, since years outside 0 to 9999 aren't really useful. So they chose ticks as small as possible while still allowing DateTime to represent the year 9999.

    Representing the year 9999 takes about 2^61.5 ticks of 100ns. Since DateTime needs two bits for timezone-related tagging, 100ns ticks are the smallest power-of-ten interval that fits an Int64 (the sketch after this list checks the arithmetic).

So using longer ticks would decrease precision without gaining anything, and shorter ticks wouldn't fit in 64 bits. => 100ns is the optimal value given the constraints.
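
A quick sketch checking that arithmetic against the actual framework values (the printed numbers are what I'd expect from the current .NET implementation; the class name is just scaffolding):

    using System;

    class RangeCheck
    {
        static void Main()
        {
            // Last tick of 31 December 9999: about 3.16 * 10^18, i.e. roughly 2^61.45
            long maxTicks = DateTime.MaxValue.Ticks;
            Console.WriteLine(maxTicks);                          // 3155378975999999999

            // Full Int64 range divided by the ticks actually needed: about 2.9,
            // i.e. fewer than two spare bits of headroom
            Console.WriteLine(long.MaxValue / (double)maxTicks);  // ~2.92

            // 10ns ticks (the next power-of-ten step down) would need ~3.16 * 10^19,
            // which overflows Int64 even before reserving the two DateTimeKind bits
            Console.WriteLine(maxTicks > long.MaxValue / 10);     // True => 10ns won't fit
        }
    }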

Upvotes: 48

Soner Gönül

Reputation: 98760

From MSDN:

A single tick represents one hundred nanoseconds or one ten-millionth of a second. There are 10,000 ticks in a millisecond.

DateTime.Ticks represents the number of ticks that have elapsed since midnight on January 1st of the year 0001 (in local time). The tick is also the smallest unit of TimeSpan. Since ticks are stored in an Int64, using milliseconds instead of ticks would lose information.

It could also simply be the default CLS implementation.
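
A short sketch illustrating those numbers and the epoch (everything used here is a standard DateTime/TimeSpan member):

    using System;

    class EpochDemo
    {
        static void Main()
        {
            Console.WriteLine(new DateTime(1, 1, 1).Ticks);   // 0: ticks count from midnight, 01/01/0001
            Console.WriteLine(TimeSpan.TicksPerMillisecond);  // 10000
            Console.WriteLine(TimeSpan.TicksPerSecond);       // 10000000

            // Round-trip: a DateTime rebuilds exactly from its ticks;
            // truncating to milliseconds would drop the sub-millisecond part
            DateTime now = DateTime.UtcNow;
            Console.WriteLine(new DateTime(now.Ticks, DateTimeKind.Utc) == now);  // True
        }
    }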

Upvotes: 6

GeorgeVremescu

Reputation: 1253

The tick is what the system clock works with.

Upvotes: -5

Erix

Reputation: 7105

For higher time resolution, even though you don't need it most of the time.
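
For instance, here's a sketch of where that resolution shows up in practice (Stopwatch.Elapsed is a TimeSpan, so its Ticks are the same 100ns units; the empty loop is just a stand-in workload):

    using System;
    using System.Diagnostics;

    class ResolutionDemo
    {
        static void Main()
        {
            Stopwatch sw = Stopwatch.StartNew();
            for (int i = 0; i < 1000; i++) { }   // stand-in workload
            sw.Stop();

            // Sub-millisecond intervals survive as whole TimeSpan ticks
            Console.WriteLine(sw.Elapsed.Ticks + " ticks");
            Console.WriteLine(sw.Elapsed.TotalMilliseconds + " ms");

            // Caution: sw.ElapsedTicks is in Stopwatch.Frequency units, not 100ns ticks
            Console.WriteLine(Stopwatch.Frequency + " stopwatch ticks per second");
        }
    }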

Upvotes: 2
