Reputation: 475
I have a few IoT devices in the central application, and they are sending telemetry to an Azure blob container. For each blob, a separate folder is created in the container (based on the upload time). The following snapshot shows the directories; in a similar way, multiple directories/subdirectories are being created to store the blobs.
How can I read this data into my Stream Analytics job? I have a Stream Analytics job with the blob container as input, and even though the container is continuously receiving data, no data shows up when I run a select * query. Please let me know how I am supposed to get blob input into Stream Analytics when each blob is stored in a separate folder in the container.
Upvotes: 0
Views: 291
Reputation: 1301
Usually, if we have a large amount of data, pulling it with a query will take time.
Try to get the data as below:
SELECT
BlobName,
EventProcessedUtcTime,
BlobLastModifiedUtcTime
FROM Input
You can also specify tokens such as {date} and {time} in the path pattern to help guide Stream Analytics to the files it should read.
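For example, if your folders follow an upload-time layout, a path pattern along these lines could be set on the blob input (the telemetry/ prefix is illustrative; substitute your container's actual folder naming):

telemetry/{date}/{time}/

together with a matching date format (for example yyyy/MM/dd) and time format (HH) in the input configuration, so that Stream Analytics expands the pattern into the concrete folder paths where new blobs arrive.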
When the job has been running long enough, some output should appear. In those output records you may notice a big delay between your custom timestamp field and the general timestamp field.
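If you want the job to order events by your own timestamp rather than the arrival time, a sketch of the query with a TIMESTAMP BY clause (eventTime is a placeholder; use whatever field your telemetry payload actually carries):

SELECT
BlobName,
EventProcessedUtcTime,
BlobLastModifiedUtcTime
FROM Input TIMESTAMP BY eventTime

With TIMESTAMP BY, the delay between the custom field and EventProcessedUtcTime becomes visible as late-arrival tolerance behavior rather than silently shifting your results.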
For a detailed understanding of how to configure streaming inputs, refer to the blog.
Also, if you want to read blobs from the root of the container, do not set a path pattern. Within the path pattern, you can specify one or more instances of the following three variables: {date}, {time}, or {partition}.
Upvotes: 1