Reputation: 327
Currently I'm storing per-minute geo-location data from devices, and I want to calculate the distance between each consecutive point and store it as a per-minute distance. After that I want to aggregate it (for example by hour, 3 hours, and other intervals) every time new data arrives in my main CSV file of per-minute geo-location data.
What I'm currently planning to do is to run U-SQL scripts at a specific interval (for example every 2-3 minutes, or triggered by an event) that read my main geo-location data, calculate the per-minute distance for each device, and aggregate it into separate files by minute, hour, etc.
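Roughly, the U-SQL I have in mind would look something like the sketch below (the column names, file paths and the hourly bucket are just placeholders for illustration, not a tested script):

    @points =
        EXTRACT DeviceId string,
                EventTime DateTime,
                Latitude double,
                Longitude double
        FROM "/geo/minute_points.csv"
        USING Extractors.Csv(skipFirstNRows : 1);

    // Pair each fix with the previous fix of the same device
    @withPrev =
        SELECT DeviceId,
               EventTime,
               Latitude,
               Longitude,
               LAG(Latitude, 1) OVER (PARTITION BY DeviceId ORDER BY EventTime) AS PrevLat,
               LAG(Longitude, 1) OVER (PARTITION BY DeviceId ORDER BY EventTime) AS PrevLon
        FROM @points;

    // Haversine distance in meters; LAG yields null for the first fix of each
    // device, so those rows are filtered out
    @minuteDistance =
        SELECT DeviceId,
               EventTime,
               new DateTime(EventTime.Year, EventTime.Month, EventTime.Day, EventTime.Hour, 0, 0) AS HourBucket,
               2.0 * 6371000.0 * Math.Asin(Math.Sqrt(
                   Math.Pow(Math.Sin((Latitude - PrevLat.Value) * Math.PI / 360.0), 2) +
                   Math.Cos(PrevLat.Value * Math.PI / 180.0) * Math.Cos(Latitude * Math.PI / 180.0) *
                   Math.Pow(Math.Sin((Longitude - PrevLon.Value) * Math.PI / 360.0), 2))) AS MetersMoved
        FROM @withPrev
        WHERE PrevLat != null AND PrevLon != null;

    // Hourly aggregate per device (3-hour and other buckets would follow the same pattern)
    @hourly =
        SELECT DeviceId,
               HourBucket,
               SUM(MetersMoved) AS MetersPerHour
        FROM @minuteDistance
        GROUP BY DeviceId, HourBucket;

    OUTPUT @hourly
    TO "/geo/aggregates/hourly_distance.csv"
    USING Outputters.Csv(outputHeader : true);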
But maybe there is a more efficient way to do what I actually need with Azure tools?
Upvotes: 2
Views: 968
Reputation: 14399
Azure Data Lake Analytics (ADLA) and U-SQL are currently batch only, i.e. larger jobs for processing big volumes, with latency measured in minutes and hours, not seconds. You might want to look at Stream Analytics, which could be a better fit for your design. You may also wish to look at a lambda architecture pattern, which covers both real-time and batch; in that case Azure Data Lake Storage (ADLS) could be the big data store.
Upvotes: 2