Reputation:
I'm working on a fluentd setup in Kubernetes. In Kubernetes I have a number of applications that write logs to stdout. I can filter, parse, and send the logs to Azure Blob Storage, but I want the logs from Blob Storage to be ingested into an Azure Data Explorer cluster. In the Data Explorer cluster I have a database and a table that already has a schema defined. The question is: how do I modify the events from fluentd so that they meet the table schema? Is it possible at all? Are there alternative ways of building such a setup?
Upvotes: 1
Views: 533
Reputation: 419
Consider using the ability to listen for blobs landing in storage via the Event Grid mechanism. Check out https://learn.microsoft.com/en-us/azure/data-explorer/ingest-data-event-grid-overview
Upvotes: 0
Reputation: 1046
Yes, it is possible to do this. You can ingest data stored in your blob into a custom table on Azure Data Explorer. Refer to this link.
Below is an example where I ingest a JSON document stored in a blob into a table in ADX:
.ingest into table Events ('https://kustosamplefiles.blob.core.windows.net/jsonsamplefiles/simple.json') with '{"format":"json", "ingestionMappingReference":"FlatEventMapping"}'
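The FlatEventMapping referenced above is a JSON ingestion mapping that must exist on the table before the command runs. A minimal sketch of creating it follows; the table columns and JSON paths ($.time, $.level, $.log) are assumptions standing in for whatever your fluentd output and table schema actually look like:

// Hypothetical target table; replace the columns with your actual schema.
.create table Events (Timestamp: datetime, Level: string, Message: string)

// JSON ingestion mapping: each entry routes a JSON path in the blob to a table column.
.create table Events ingestion json mapping 'FlatEventMapping' '[{"column":"Timestamp","Properties":{"path":"$.time"}},{"column":"Level","Properties":{"path":"$.level"}},{"column":"Message","Properties":{"path":"$.log"}}]'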
If the schema is difficult to parse, I would recommend ingesting first into a raw table (source table). Then you can have an update policy that parses the data and moves it into other tables. You can check this link to understand update policies. A sketch of that pattern is below.
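Roughly like this (the table names, column names, and parsing logic are assumptions for illustration): land each record as a single dynamic column, then let an update policy run a parsing function that appends the shaped rows to the typed table.

// Raw landing table: one dynamic column holding the whole JSON record.
.create table RawEvents (Record: dynamic)

// Mapping that puts the entire JSON document into the dynamic column.
.create table RawEvents ingestion json mapping 'RawEventMapping' '[{"column":"Record","Properties":{"path":"$"}}]'

// Parsing function that shapes raw records to the schema of the Events table.
.create function ParseEvents() {
    RawEvents
    | extend Timestamp = todatetime(Record.time), Level = tostring(Record.level), Message = tostring(Record.log)
    | project Timestamp, Level, Message
}

// Update policy: whenever data lands in RawEvents, run ParseEvents() and append the result to Events.
.alter table Events policy update @'[{"IsEnabled": true, "Source": "RawEvents", "Query": "ParseEvents()", "IsTransactional": false}]'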
Upvotes: 2
Reputation: 7618
Take a look at ingestion mappings. You can pick the properties you care about and route them to the applicable columns, and when a new property arrives you can change the mapping and update the table schema to match.
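For example (a hypothetical sketch, reusing the Events table and FlatEventMapping names from the ingest example above; the new column and JSON path are assumptions), when the applications start emitting a new field you can extend the table and re-create the mapping:

// Add a column for the new property.
.alter-merge table Events (PodName: string)

// Re-create the mapping so the new JSON path is routed to the new column.
.create-or-alter table Events ingestion json mapping 'FlatEventMapping' '[{"column":"Timestamp","Properties":{"path":"$.time"}},{"column":"Level","Properties":{"path":"$.level"}},{"column":"Message","Properties":{"path":"$.log"}},{"column":"PodName","Properties":{"path":"$.kubernetes.pod_name"}}]'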
Upvotes: 3