Reputation: 3608
I have implemented Kafka to receive data from an IoT gateway (a small system running on Linux, which in turn connects to another device and sends it commands). The IoT gateway writes data such as logs, device commands, etc., and the same data will be used for analytics on the platform.
Now I want to send commands from my centralised platform (a server in the cloud data center) to the IoT gateway. Can I use the same Kafka connection, or do I need to switch to some other protocol?
Upvotes: 0
Views: 211
Reputation: 10065
Kafka is really great as an ingestion system, so it fits very well in an IoT scenario like yours for telemetry. When you need to send commands to control the device, you switch from a publish/subscribe pattern to a request/response pattern, where the cloud platform sends a command and waits for a reply from the device (command accepted, executed, ...).

Even in this case a messaging infrastructure is needed, for asynchronous communication and temporal decoupling, because the IoT gateway may not be online when the system sends the command. At the same time, you don't want a stale command to be executed long after it was sent (when the device comes back online), so a TTL (Time To Live) on messages is a useful feature.

With Kafka you have an event log where you can store all messages for a very long time and re-read the stream: such features are really great for telemetry, but maybe not for command & control. In this case a message broker such as ActiveMQ Artemis could be the right choice, using one queue for sending commands and another for the replies, for example. A command is then deleted when it is consumed or when its TTL expires.

Of course it's still possible to use Kafka with a different topic, but the consumer needs to handle each consumed command without ever re-reading the stream, to avoid executing the same command twice or more.

Just my 2 cents ...
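To make the TTL and "never execute twice" points concrete, here is a minimal sketch of the filtering a Kafka command consumer would need to do on its side (since Kafka itself won't expire or dedupe the commands for you). The message shape, field names, and the `CommandProcessor` class are my own assumptions for illustration, not part of any Kafka API:

```python
import time

class CommandProcessor:
    """Drops stale commands (TTL) and skips duplicates on stream re-reads."""

    def __init__(self, ttl_seconds):
        self.ttl_seconds = ttl_seconds
        self.processed_ids = set()  # in production this would be persisted

    def should_execute(self, command, now=None):
        """Return True only for a fresh, not-yet-executed command.

        `command` is assumed to be a dict with an "id" and a producer
        "timestamp" (seconds since epoch).
        """
        now = time.time() if now is None else now
        if command["id"] in self.processed_ids:
            return False  # already executed: avoid double execution on re-read
        if now - command["timestamp"] > self.ttl_seconds:
            return False  # stale command: TTL expired, don't execute late
        self.processed_ids.add(command["id"])
        return True
```

With Kafka, this check runs in the gateway's consumer loop before acting on each record. With a broker like ActiveMQ Artemis, the TTL part moves to the broker: you set an expiration on the message itself and the broker discards it if it isn't consumed in time, which is exactly why a queue-based broker is a more natural fit for command & control.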
Upvotes: 1