karatuno1

Reputation: 11

Secure way to upload files to GCP Cloud Storage

We're making some machines in which there's a part which uploads the images captured by the camera to Google Cloud Storage. For this purpose what I've done is

  1. Create a service account for each machine.
  2. Create a custom role with these permissions:
     • storage.objects.create
     • storage.buckets.get
     • storage.objects.get
  3. Grant this role to that service account.
  4. Download the JSON credentials key file and use it with a Python script (in which I specify the bucket name) to upload images to GCP Storage (a minimal sketch of this is shown below).
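For reference, a minimal sketch of such an upload script (the google-cloud-storage client library is assumed; the key file, bucket and object names are placeholders, not taken from the question):

    # Minimal sketch: upload one image with a per-machine service account key file.
    # The key path, bucket name and object name below are placeholders.
    from google.cloud import storage

    def upload_image(key_path, bucket_name, local_path, dest_name):
        # Authenticate as the machine's service account
        client = storage.Client.from_service_account_json(key_path)
        blob = client.bucket(bucket_name).blob(dest_name)
        blob.upload_from_filename(local_path)

    if __name__ == "__main__":
        upload_image("machine-sa-key.json", "my-machine-images", "capture.jpg", "machine-01/capture.jpg")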

Is this way of doing things efficient and secure given that we only ship 2-3 machines each month?

Also, I will have to ship a JSON key file with each machine. If the above method is valid, is this fine, or is there any method to hide this key file?

Upvotes: 0

Views: 1691

Answers (1)

guillaume blaquiere

Reputation: 75715

Your case isn't so simple!

  • Firstly, if you want to put a dedicated service account in each machine, you will hit a limit one day (you are limited to 100 service accounts per project). And using the same service account, or the same key, on every machine is too dangerous.
  • Secondly, your use case sounds like an IoT use case, where you have lots of devices on the edge communicating with the cloud. But Pub/Sub messages are limited to 10MB max, so the IoT Core solution doesn't fit your case.
  • The last 2 solutions are based on the same principle:
    • Make an endpoint public (Cloud Run, Cloud Functions, App Engine or whatever you want)
    • Call this endpoint from your machine, with its own token (i.e. a string, encrypted or not)
    • Check the token; if it is OK, you can do one of the following (here are the 2 alternatives)
  1. Create an access token (a short-lived token) on a service account with the minimal permissions for the machine's usage, and send it back to the machine. The machine will use it to call the Google Cloud APIs, such as the Cloud Storage API. The advantage of this solution is that the same access token can reach other GCP APIs in the future, if your use case and your machine updates require them (see the first sketch below).
  2. Create a signed URL and send it back to the machine. Then the machine has to upload the file to this URL. The advantage is the strict limitation to Cloud Storage, with no access to any other GCP service (see the second sketch below).
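For alternative 1, a minimal sketch of what the endpoint could do once the machine's token has been verified: mint a short-lived access token by impersonating a dedicated uploader service account. The service account name, scope and lifetime are assumptions, and the endpoint's own identity needs the Service Account Token Creator role on that account:

    # Sketch only: mint a short-lived access token for a dedicated uploader service account.
    # Run this server-side (Cloud Run / Cloud Functions), after checking the machine's token.
    import google.auth
    from google.auth import impersonated_credentials
    from google.auth.transport.requests import Request

    def mint_upload_token():
        source_credentials, _ = google.auth.default()
        target = impersonated_credentials.Credentials(
            source_credentials=source_credentials,
            target_principal="machine-uploader@my-project.iam.gserviceaccount.com",
            target_scopes=["https://www.googleapis.com/auth/devstorage.read_write"],
            lifetime=900,  # seconds, i.e. 15 minutes
        )
        target.refresh(Request())
        return target.token  # return this to the machine

The machine can then pass this token as a Bearer token to the Cloud Storage API (or hand it to a client library) until it expires.

For alternative 2, a minimal sketch of generating a V4 signed upload URL server-side. The bucket, object name and content type are assumptions, and the credentials used must be able to sign (e.g. a service account key, or the signBlob permission):

    # Sketch only: create a V4 signed URL that lets the machine PUT one object.
    from datetime import timedelta
    from google.cloud import storage

    def make_upload_url(bucket_name, object_name):
        client = storage.Client()
        blob = client.bucket(bucket_name).blob(object_name)
        return blob.generate_signed_url(
            version="v4",
            expiration=timedelta(minutes=15),
            method="PUT",
            content_type="image/jpeg",
        )

The machine then uploads with a plain HTTP PUT to the returned URL, sending the same Content-Type header.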

The main issue with the last 2 solutions is that they require a public endpoint, and you are exposed to attacks on it. You can protect it behind a load balancer and mitigate attacks with Cloud Armor. Also think about limiting the scalability of your public endpoint, to prevent useless expenses in case of attack.

Upvotes: 2
