Sabarish

Reputation: 141

Azure Message size limit and IOT

I read through the Azure documentation and found that the message size limit for Storage Queues is 64 KB and for Service Bus is 256 KB. We are trying to develop an application which will read sensor data from some devices, call a REST service, and upload it to the cloud. This data will be stored in queues and then dumped into a cloud database.

There is a chance that the sensor data collected is more than 256 KB. In such cases, what is the recommended approach? Do we need to split the data in the REST service and then put chunks of it in the queue, or is there another recommended pattern?
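
For context, a minimal sketch of the "split the data into chunks" option (in Python, with the azure-servicebus SDK; the connection string, queue name, and chunk size below are placeholder assumptions, not values from our actual service) might look like this:

```python
# Minimal sketch: split a large sensor payload into queue-sized chunks
# and send each chunk as its own Service Bus message.
# Connection string, queue name, and chunk size are placeholders.
import uuid

from azure.servicebus import ServiceBusClient, ServiceBusMessage

CONN_STR = "<service-bus-connection-string>"   # placeholder
QUEUE_NAME = "sensor-data"                     # placeholder
CHUNK_SIZE = 192 * 1024                        # stay well under the 256 KB limit


def send_in_chunks(payload: bytes) -> None:
    """Split a large payload into chunks and send them with reassembly metadata."""
    batch_id = str(uuid.uuid4())
    chunks = [payload[i:i + CHUNK_SIZE] for i in range(0, len(payload), CHUNK_SIZE)]

    with ServiceBusClient.from_connection_string(CONN_STR) as client:
        with client.get_queue_sender(QUEUE_NAME) as sender:
            for index, chunk in enumerate(chunks):
                msg = ServiceBusMessage(
                    chunk,
                    application_properties={
                        "batch_id": batch_id,      # lets the consumer reassemble
                        "chunk_index": index,
                        "chunk_count": len(chunks),
                    },
                )
                sender.send_messages(msg)
```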

Any help is appreciated

Upvotes: 1

Views: 3254

Answers (1)

David Crook

Reputation: 2730

You have several conflicting technology statements. I will begin by clarifying a few.

  1. Service Bus/IoT Hub are not POST calls. A POST call would go to a RESTful service, which exists separately. IoT Hub uses a low-latency message-passing system that is abstracted away from you. These messages are intended to be high-volume, small packets, which fits most IoT scenarios.

  2. In the situation in which a message is larger than 256 KB (which is unusual for an IoT scenario; I would be interested to see why those messages are so large), you should ideally upload the payload to blob storage (see the sketch after this list). You can still post the smaller packets as messages.

    • If you have access to the blob storage APIs from your devices, you should go that route.
    • If you do not have access to them, you will have to post the big packets to a REST endpoint and either cross your fingers that they make it, or chop them up.

      1. You can run post-hoc analytics on blob storage. I would recommend using the wasb prefix, as those containers are Hadoop-compatible and you can stand up analytics clusters on top of that storage mechanism.
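
To make those two paths concrete, here is a minimal sketch (Python, using the azure-iot-device and azure-storage-blob SDKs; the connection strings, container name, and the plain size check are illustrative assumptions, not anything prescribed above): small readings go out as IoT Hub messages, while anything over the limit lands in blob storage and only a small reference message is sent.

```python
# Minimal sketch: small readings go through IoT Hub messaging; oversized
# payloads go to blob storage and only a pointer is sent as a message.
# Connection strings, container name, and the size threshold are placeholders.
import json
import uuid

from azure.iot.device import IoTHubDeviceClient, Message
from azure.storage.blob import BlobServiceClient

DEVICE_CONN_STR = "<iot-hub-device-connection-string>"    # placeholder
STORAGE_CONN_STR = "<storage-account-connection-string>"  # placeholder
CONTAINER = "sensor-payloads"                             # placeholder
SIZE_LIMIT = 256 * 1024

device_client = IoTHubDeviceClient.create_from_connection_string(DEVICE_CONN_STR)
blob_service = BlobServiceClient.from_connection_string(STORAGE_CONN_STR)


def send_reading(payload: bytes) -> None:
    """Send small readings directly; park large ones in blob storage."""
    if len(payload) <= SIZE_LIMIT:
        device_client.send_message(Message(payload))
        return

    # Upload the large payload to blob storage...
    blob_name = f"{uuid.uuid4()}.bin"
    blob_client = blob_service.get_blob_client(container=CONTAINER, blob=blob_name)
    blob_client.upload_blob(payload)

    # ...and send only a small reference message through IoT Hub.
    reference = json.dumps({"container": CONTAINER, "blob": blob_name})
    device_client.send_message(Message(reference))
```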

You have no real need for a queue that I can immediately see.

You should take a look at the patterns leveraging:

  1. Stream Analytics: https://azure.microsoft.com/en-us/services/stream-analytics/
  2. Azure Data Factory: https://azure.microsoft.com/en-us/services/data-factory/

Your typical ingestion will be: get your data up into the cloud, into super-cheap storage, as easily as possible, and then deal with analytics later using clusters you can stand up and tear down on demand. That cheap storage is typically blob, and that analytics cluster is usually some form of Hadoop. Using Data Factory lets you pipe your data around as you figure out what you are going to use specific components of it for.
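
As an illustration of that pattern, a Spark cluster stood up on demand can read the landed blobs directly over the wasbs scheme, assuming the hadoop-azure driver is available on the cluster (as it is on HDInsight); the storage account, key, and container below are placeholders:

```python
# Illustrative sketch: an on-demand Spark cluster reading sensor payloads
# straight out of blob storage via the wasbs:// scheme (hadoop-azure).
# Storage account, access key, and container names are placeholders.
from pyspark.sql import SparkSession

ACCOUNT = "<storage-account>"
ACCOUNT_KEY = "<storage-account-key>"
CONTAINER = "sensor-payloads"

spark = (
    SparkSession.builder
    .appName("sensor-analytics")
    # Pass the blob storage key through to the hadoop-azure (wasb) driver.
    .config(f"spark.hadoop.fs.azure.account.key.{ACCOUNT}.blob.core.windows.net",
            ACCOUNT_KEY)
    .getOrCreate()
)

# Read every JSON payload previously landed in the container.
df = spark.read.json(f"wasbs://{CONTAINER}@{ACCOUNT}.blob.core.windows.net/")
df.printSchema()
```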

Here is an example where I used HBase for ingestion, with cheap blob storage as the underlayment and Azure Machine Learning as part of the analytics solution: http://indiedevspot.com/2015/07/09/powering-azureml-with-hadoop-hbase/

Upvotes: 4
