What is the standard time precision of DateTime?

I made a small WinForms program in Visual Studio to transfer data, and I used the following method to measure the transfer time. After the transfer completes, the program shows a dialog box with the elapsed time.

But I don't know what the precision of the time, or the resolution of the timer, actually is. How can it be so precise, down to a microsecond?

var startTime = DateTime.Now;
this.transferdata();
var endTime = DateTime.Now;
var timeElapsed = endTime.Subtract(startTime);

      

(Screenshot: the dialog box showing the elapsed time.)

When I looked at the definition of the DateTime class, it only has millisecond precision. Can anyone tell me why Visual Studio 2012 seems to have such a high-resolution timer? Or is it related to the operating system?



2 answers


The accuracy of the clock depends on the operating system. The system clock ticks a certain number of times per second, and you can only measure whole ticks.

You can test the resolution for a specific computer using the following code:

// Busy-wait until DateTime.Now changes; the difference is one system clock tick.
DateTime t1 = DateTime.Now;
DateTime t2;
while ((t2 = DateTime.Now) == t1) ;
Console.WriteLine(t2 - t1);

On my computer the result is 00:00:00.0156253, which means that the system clock ticks 64 times per second.

(Note: the DateTime type also has ticks, but these are not the same as system clock ticks. A DateTime tick is 1/10,000,000 of a second.)
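
As a quick illustration (a minimal sketch; the printed Ticks value depends on the current time), you can inspect the DateTime/TimeSpan tick unit directly:

// using System;
Console.WriteLine(TimeSpan.TicksPerSecond);  // 10000000, i.e. one tick = 100 ns
Console.WriteLine(DateTime.Now.Ticks);       // the current time expressed as a tick count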



To measure the time more precisely, you should use the Stopwatch class. Its resolution is also system dependent, but it is much higher than that of the system clock. You can read the resolution from the Stopwatch.Frequency property; on my computer it returns 2143566, which is rather more than 64...
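
If you want to see what your own machine reports, here is a minimal sketch (the values will differ per system):

// using System.Diagnostics;
Console.WriteLine(Stopwatch.Frequency);        // Stopwatch ticks per second, e.g. 2143566
Console.WriteLine(Stopwatch.IsHighResolution); // true when a high-resolution timer backs Stopwatch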

Start the stopwatch before the work, stop it afterwards, and then read the elapsed time:

Stopwatch time = Stopwatch.StartNew();  // start timing
this.transferdata();                    // the operation being measured
time.Stop();                            // stop timing
TimeSpan timeElapsed = time.Elapsed;    // elapsed time as a TimeSpan

This returns the time at the resolution that the TimeSpan type can handle, i.e. 1/10,000,000 of a second. You can also calculate the elapsed time from the number of Stopwatch ticks:

// Elapsed time in seconds, computed from the raw Stopwatch tick count.
double timeElapsed = (double)time.ElapsedTicks / (double)Stopwatch.Frequency;



You are confusing several things: precision, accuracy, frequency, and resolution.

You can have a variable with a precision of a billion decimal places, but if you cannot actually measure a value that small, that is the difference between precision and resolution. Frequency is the number of times per second a measurement is taken and relates to resolution. Accuracy is how closely a given sample matches the actual value.



So, given that DateTime has much higher precision than the system clock has resolution, simply calling DateTime.Now won't necessarily give you an exact timestamp. However, Windows does have higher-resolution timers, and the Stopwatch class uses them to measure elapsed time, so if you use that class you get much better accuracy.
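
To see the difference, here is a minimal sketch (exact numbers depend on the machine; Thread.Sleep and the method names are just placeholders for some short operation):

// using System; using System.Diagnostics; using System.Threading;
Stopwatch sw = Stopwatch.StartNew();
DateTime start = DateTime.Now;
Thread.Sleep(5);                  // something shorter than the ~15.6 ms system clock tick
sw.Stop();
DateTime end = DateTime.Now;
Console.WriteLine(end - start);   // often 00:00:00 or one whole clock tick (~15.6 ms)
Console.WriteLine(sw.Elapsed);    // roughly the actual duration, at much finer resolution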

DateTime has no "default precision". It has only one precision, plus the minimum and maximum values it can store. Internally, DateTime stores its value as a single number, and that value is formatted into whatever unit you want to display (seconds, minutes, days, ticks, whatever...).
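
For example, a minimal sketch (output depends on the current time) showing the same single internal value rendered in different ways:

// using System;
DateTime now = DateTime.Now;
Console.WriteLine(now.Ticks);                // the single internal value, in 100 ns ticks
Console.WriteLine(now.ToString("o"));        // round-trip format, fractional seconds down to ticks
Console.WriteLine(now.ToString("HH:mm:ss")); // formatted to whole seconds
Console.WriteLine(now.Millisecond);          // just the millisecond component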







