saikumar

Reputation: 179

How to send sensor data (like temperature data from DHT11 sensor) to Google Cloud IoT Core and store it

I am working on connecting a Raspberry Pi (3B+) to Google Cloud and sending sensor data to Google Cloud IoT Core, but I couldn't find any content on this topic. I would be very thankful if anyone could help me with this.

PS: I have already followed the interactive tutorial from Google Cloud itself, connected a simulated virtual device to the cloud, and sent data. I am really looking for a tutorial that helps me connect a physical Raspberry Pi.

Thank you

Upvotes: 0

Views: 954

Answers (3)

class

Reputation: 8681

You may want to try following along with this community article covering pretty much exactly what you're asking.

The article covers the following steps (a rough device-side sketch of the publishing part follows the list):

  • Creating a registry for your gateway device (the Raspberry Pi)
  • Adding a temperature / humidity sensor
  • Adding a light
  • Connecting the devices to Cloud IoT Core
  • Sending the data from the sensors to Google Cloud
  • Pulling the data back using PubSub
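
In case it helps, here is a rough device-side sketch following the same pattern as Google's public MQTT samples: read the DHT11, build a JWT from the device's private key, and publish JSON to the device's events topic on the IoT Core MQTT bridge. The project, registry, device IDs and file paths below are placeholders, and it assumes the Adafruit_DHT, pyjwt and paho-mqtt packages are installed:

    import datetime
    import ssl
    import time

    import Adafruit_DHT              # pip install Adafruit_DHT
    import jwt                       # pip install pyjwt
    import paho.mqtt.client as mqtt  # pip install paho-mqtt

    # Placeholders -- replace with your own project, region, registry and device IDs.
    PROJECT_ID = "my-project"
    REGION = "us-central1"
    REGISTRY_ID = "my-registry"
    DEVICE_ID = "my-rpi"
    PRIVATE_KEY = "rsa_private.pem"   # private key registered with the device in IoT Core
    CA_CERTS = "roots.pem"            # Google root CA bundle
    DHT_PIN = 4                       # GPIO pin the DHT11 data line is wired to

    def create_jwt():
        """Build the short-lived JWT that IoT Core expects as the MQTT password."""
        now = datetime.datetime.utcnow()
        claims = {"iat": now, "exp": now + datetime.timedelta(minutes=60), "aud": PROJECT_ID}
        with open(PRIVATE_KEY, "r") as f:
            return jwt.encode(claims, f.read(), algorithm="RS256")

    client = mqtt.Client(
        client_id=f"projects/{PROJECT_ID}/locations/{REGION}"
                  f"/registries/{REGISTRY_ID}/devices/{DEVICE_ID}")
    client.username_pw_set(username="unused", password=create_jwt())
    client.tls_set(ca_certs=CA_CERTS, tls_version=ssl.PROTOCOL_TLSv1_2)
    client.connect("mqtt.googleapis.com", 8883)
    client.loop_start()

    while True:
        humidity, temperature = Adafruit_DHT.read_retry(Adafruit_DHT.DHT11, DHT_PIN)
        if humidity is not None and temperature is not None:
            payload = f'{{"temperature": {temperature}, "humidity": {humidity}}}'
            # Telemetry published to the device's "events" topic is forwarded by
            # IoT Core to the Pub/Sub topic configured on the registry.
            client.publish(f"/devices/{DEVICE_ID}/events", payload, qos=1)
        time.sleep(60)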

Upvotes: 1

Dhruv Ahuja

Reputation: 58

For reference: https://cloud.google.com/dataflow/docs/guides/templates/provided-streaming

This link will guide you through deploying a Google-provided Dataflow template from Pub/Sub to BigQuery.

For your own custom pipeline, you can take help from the GitHub link to the pipeline code.
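
If you go the custom-pipeline route, a minimal Python/Apache Beam sketch of the same Pub/Sub-to-BigQuery flow could look like this (the project, subscription, dataset and schema names are illustrative placeholders, and the device payload is assumed to be JSON):

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def run():
        # streaming=True is required because Pub/Sub is an unbounded source.
        options = PipelineOptions(streaming=True)

        with beam.Pipeline(options=options) as p:
            (
                p
                # Placeholder subscription path -- replace with your own.
                | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                    subscription="projects/my-project/subscriptions/my-telemetry-sub")
                # Device payloads are assumed to be JSON like {"temperature": 23, "humidity": 40}.
                | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "my-project:iot_dataset.telemetry",
                    schema="temperature:FLOAT,humidity:FLOAT",
                    create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
            )

    if __name__ == "__main__":
        run()

You can run this locally with the default DirectRunner while testing, and pass the usual --runner=DataflowRunner plus project/region/staging options to deploy it as a streaming Dataflow job.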

Upvotes: 0

Dhruv Ahuja

Reputation: 58

Create a registry in Google Cloud IoT Core and set up your devices and their public/private key pairs.

You will also have to set up Pub/Sub topics for publishing device telemetry and state events while creating the IoT Core registry.
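
A rough provisioning sketch with the google-cloud-iot Python client, assuming you have already generated an RSA key pair for the device (for example with openssl) and that the project, region, topic and IDs below are placeholders:

    from google.cloud import iot_v1  # pip install google-cloud-iot

    # Placeholders -- replace with your own project, region, topic and IDs.
    PROJECT_ID = "my-project"
    REGION = "us-central1"
    REGISTRY_ID = "my-registry"
    DEVICE_ID = "my-rpi"
    TELEMETRY_TOPIC = f"projects/{PROJECT_ID}/topics/device-telemetry"

    client = iot_v1.DeviceManagerClient()

    # Create the registry and point its telemetry events at the Pub/Sub topic.
    parent = f"projects/{PROJECT_ID}/locations/{REGION}"
    registry = {
        "id": REGISTRY_ID,
        "event_notification_configs": [{"pubsub_topic_name": TELEMETRY_TOPIC}],
    }
    client.create_device_registry(request={"parent": parent, "device_registry": registry})

    # Register the Raspberry Pi with the public half of its key pair
    # (e.g. the rsa_cert.pem produced by openssl).
    with open("rsa_cert.pem") as f:
        certificate = f.read()

    device = {
        "id": DEVICE_ID,
        "credentials": [{
            "public_key": {
                "format": iot_v1.PublicKeyFormat.RSA_X509_PEM,
                "key": certificate,
            }
        }],
    }
    registry_path = client.registry_path(PROJECT_ID, REGION, REGISTRY_ID)
    client.create_device(request={"parent": registry_path, "device": device})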

Once that is done, you can create a streaming pipeline in Cloud Dataflow that will read data from a Pub/Sub subscription and sink it in BigQuery (a relational data warehouse) or Bigtable (a NoSQL database).
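
Before building the pipeline, you can sanity-check that telemetry is actually reaching Pub/Sub by pulling from a subscription directly. A small sketch with the google-cloud-pubsub client (project and subscription names are placeholders):

    from google.cloud import pubsub_v1  # pip install google-cloud-pubsub

    subscriber = pubsub_v1.SubscriberClient()
    # Placeholder subscription attached to the registry's telemetry topic.
    subscription_path = subscriber.subscription_path("my-project", "my-telemetry-sub")

    def callback(message):
        # Each message body is whatever the device published, e.g. a JSON string.
        print("Received:", message.data.decode("utf-8"))
        message.ack()

    streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)
    print(f"Listening on {subscription_path}...")

    try:
        # Block for 60 seconds, then stop pulling.
        streaming_pull_future.result(timeout=60)
    except Exception:
        streaming_pull_future.cancel()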

Dataflow is a managed service for Apache Beam, where you can create and deploy pipelines written in Java or Python.

If you are not familiar with coding, you can use Data Fusion, which lets you build your ETLs with drag-and-drop functionality similar to Talend. You can create a Data Fusion instance in order to build a streaming ETL pipeline. The source will be Pub/Sub and the sink will be BigQuery or Bigtable, depending on your use case.

Upvotes: 0
