Reputation: 312
I have an Azure Log Analytics workspace and inside it I created a custom table to ingest some of my logs. I used these two guides for it (mainly the first one):
In my logs I have a field:
"Time": "2023-02-07 11:15:23.926060"
Using a DCR transformation, I create a TimeGenerated field like this:
source
| extend TimeGenerated = todatetime(Time)
| project-away Time
Everything works fine: I manage to ingest my data and query it with KQL. The problem is that I can't ingest data with an older timestamp. If the timestamp is the current time or close to it, it works fine. If my timestamp is, say, from two days ago, it gets overwritten with the current time.
Example of the log I send:
{
"Time": "2023-02-05 11:15:23.926060",
"Source": "VM03",
"Status": 1
}
The log I receive:
{
"TimeGenerated": "2023-02-07 19:35:23.926060",
"Source": "VM03",
"Status": 1
}
Can you tell me why this is happening, why I can't ingest logs from several days ago, and how to fix it? The guides I used do not mention anything of the sort, regrettably.
Upvotes: 1
Views: 1490
Reputation: 41
A feature is planned by MS for Q1 2024 (Edit: postponed to Q4 or later) to also store the original/incoming TimeGenerated value. Until then we use an ingestion-time DCR transformation on some tables:
source
| extend TimeGenerated_CF = TimeGenerated
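At query time the preserved value can then be read alongside the service-assigned TimeGenerated. A minimal sketch, assuming a hypothetical DCR-based custom table named MyTable_CL with the Source and Status columns from the question plus the TimeGenerated_CF column created by the transform above:

MyTable_CL
| extend IngestedAt = TimeGenerated        // timestamp assigned (or overwritten) by Log Analytics
| extend OriginalTime = TimeGenerated_CF   // value preserved by the ingestion-time DCR
| project OriginalTime, IngestedAt, Source, Status
| order by OriginalTime desc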
Edit: Some tables (or at least SecurityEvent) now have a TimeCollected column: "The time stamp when the event was collected from the machine".
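If your data lands in one of those tables, the two timestamps can be compared directly in KQL. A small sketch against SecurityEvent, assuming the TimeCollected column described above is present in your workspace:

SecurityEvent
| where TimeGenerated > ago(1d)
// TimeCollected = when the event was collected from the machine
| project TimeGenerated, TimeCollected, Computer, Activity
| take 10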
Upvotes: 0
Reputation: 29770
I've hit this limit once before, a long, long time ago. I asked a question and got a response from someone working on Application Insights: only data no older than 48 hours is ingested.
Nowadays, AFAIK, the same applies to Log Analytics. I am not sure whether the 48-hour limit still stands, but I think it is fair to assume some limit is still enforced and there is no way around it.
Back then I took my loss and worked with recent data only.
Upvotes: 1