vishruti

Reputation: 475

How to read blobs that are being uploaded to separate folders in a container into a Stream Analytics job

I have a few IoT devices in the central application, and they send telemetry to an Azure blob container. For each blob, a separate folder is created in the container based on the upload time. The following snapshot shows the directories; multiple directories/subdirectories are being created in a similar way to store the blobs.

[Snapshot: container directory structure]

How can I read this data into my Stream Analytics job? I have a Stream Analytics job with the blob container as input; even though the container is continuously receiving data, it doesn't show any data when I run a SELECT * query. Please let me know how I am supposed to get blob input into Stream Analytics when each blob is stored in a separate folder in the container.

Upvotes: 0

Views: 291

Answers (1)

SaiKarri-MT

Reputation: 1301

Usually, if there is a large amount of data, pulling all of it with a query takes time.

Try to get the data as below:

SELECT
    BlobName,                 -- blob input metadata: name of the source blob
    EventProcessedUtcTime,    -- when Stream Analytics processed the event
    BlobLastModifiedUtcTime   -- blob input metadata: last-modified time of the blob
FROM Input

You can also specify tokens such as {date} and {time} in the path pattern to help guide Stream Analytics to the files it should read.
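For example, a minimal sketch of such an input configuration; the container prefix and folder layout below are assumptions, not taken from the original post:

    Path pattern: telemetry/{date}/{time}
    Date format:  YYYY/MM/DD
    Time format:  HH

With these settings the job would read blobs uploaded to paths such as telemetry/2021/11/30/09/device01.json.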

When the job has been running long enough, some output appears, and in those output records you may notice a big delay between the custom timestamp field and the general timestamp field.
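If that delay matters, you can tell the job to order events by your own timestamp field instead of the arrival time using TIMESTAMP BY. A minimal sketch, assuming the telemetry payload carries a field named telemetryTime (that field name is hypothetical):

SELECT
    BlobName,
    System.Timestamp() AS EventTime  -- the timestamp assigned via TIMESTAMP BY
FROM Input
TIMESTAMP BY telemetryTime           -- hypothetical payload field; substitute your own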

For a detailed understanding of how to configure streaming inputs, refer to this blog.

Also, if you want to read blobs from the root of the container, do not set a path pattern. Within the path, you can specify one or more instances of the following three variables: {date}, {time}, or {partition}.
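As an illustration, a pattern using all three variables might look like the following; the prefix and layout here are assumptions for the sketch:

    devicedata/{partition}/{date}/{time}

With a date format of YYYY/MM/DD, a time format of HH, and partition IDs 0 through 3, this would match blobs such as devicedata/2/2021/11/30/09/blob.json.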

Upvotes: 1
