Reputation: 365
I'm using .NET Core 2.2 with Entity Framework. I need to insert some data into SQL Server.
In C# I use the DateTime type, and in SQL Server I use datetime2(6)
with the default value (sysdatetime()).
For example, my object looks like:
public partial class Contact
{
public int Id { get; set; }
public string Name { get; set; }
public DateTime CreatedTime { get; set; }
}
If I leave CreatedTime unset when creating the object and let SQL Server apply the default (sysdatetime()), I get something like 2020-05-19 09:31:12.383593 in the database.
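For reference, the mapping behind that column could be configured roughly like this in OnModelCreating (this is only a sketch of the assumed setup, not the exact code from my project):
// Assumed mapping: datetime2(6) column with a SQL-side default of sysdatetime()
modelBuilder.Entity<Contact>()
    .Property(c => c.CreatedTime)
    .HasColumnType("datetime2(6)")
    .HasDefaultValueSql("sysdatetime()");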
As I understand it, the DateTime type itself already has better-than-microsecond precision (one tick is 100 ns). However, if I set a value for the CreatedTime property myself, for example:
obj.CreatedTime = DateTime.Now.AddTicks(123);
Console.WriteLine(obj.CreatedTime.ToString("yyyy-MM-dd HH:mm:ss.ffffff"));
I can see the value in the console: 2020-08-18 07:42:52.131102. But when I insert that obj through Entity Framework, the value in the database only has millisecond precision: 2020-08-18 07:42:52.131000. It looks like Entity Framework is somehow only passing the DateTime at millisecond precision?
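Just to confirm the precision is really there on the C# side before EF gets involved, a quick check like this (illustration only, not from my actual code) shows the sub-millisecond ticks are present in the value:
var value = DateTime.Now.AddTicks(123);
// One tick is 100 ns; 10,000 ticks make one millisecond
Console.WriteLine(TimeSpan.TicksPerMillisecond);               // 10000
Console.WriteLine(value.Ticks % TimeSpan.TicksPerMillisecond); // remaining sub-millisecond ticks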
Thank you so much for your support.
Upvotes: 2
Views: 866
Reputation: 81503
You may need to force EF to map the property to SQL Server's datetime2(7) type, the default/maximum precision with 100 ns resolution (YYYY-MM-DD hh:mm:ss.0000000). You can do this in several ways.
Globally
// In OnModelCreating: map every DateTime property in the model to datetime2
foreach (var property in modelBuilder.Model.GetEntityTypes()
    .SelectMany(t => t.GetProperties())
    .Where(p => p.ClrType == typeof(DateTime)))
    property.Relational().ColumnType = "datetime2";
Explicitly
modelBuilder
.Entity<SomeEntity>()
.Property(f => f.SomeDate)
.HasColumnType("datetime2");
Or via Attributes
[Column(TypeName = "DateTime2(7)")]
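For the Contact class from the question, the attribute approach could look roughly like this (a sketch; only the attribute is the new part, the rest is the original class):
using System;
using System.ComponentModel.DataAnnotations.Schema;

public partial class Contact
{
    public int Id { get; set; }
    public string Name { get; set; }

    // Stored as datetime2(7), so the full 100 ns tick precision survives the insert
    [Column(TypeName = "datetime2(7)")]
    public DateTime CreatedTime { get; set; }
}
After the column type is updated (e.g. via a new migration), the full-precision value should round-trip instead of being truncated to milliseconds.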
Upvotes: 6