SebastianG

Reputation: 9594

Terraform GCP cloud function: how to deploy outside Terraform in CI, without breaking past deployments when running locally?

I have some working terraform definitions among a larger project:

resource "google_storage_bucket" "owlee_functions_bucket" {
  name     = "owlee_functions_bucket"
  location = "europe-west2"
  project  = "owlee-software"
}

resource "google_storage_bucket_object" "archive" {
  name   = "index.zip"
  bucket = google_storage_bucket.owlee_functions_bucket.name
  source = "../apps/backend/dist/index.zip"
}

resource "google_cloudfunctions_function" "backend_function" {
  name    = "backend_function"
  runtime = "nodejs16"
  project = "owlee-software"
  region  = "europe-west2"

  available_memory_mb   = 128
  source_archive_bucket = google_storage_bucket.owlee_functions_bucket.name
  source_archive_object = google_storage_bucket_object.archive.name
  trigger_http          = true
  entry_point           = "OWLEE"
}

Then I'm trying to deploy via CI; for now, I'm just running terraform apply after zipping up the new version of the function to handle deployment.

It's not great, and ideally I'd like to change that to a non-Terraform process, but that doesn't seem to be documented or possible anywhere, which makes me think I have the wrong approach here.

The second issue, which is more urgent to solve:

I want to continue managing my infrastructure locally for now, and I do not want to have to zip up a new version of the function every time I run terraform apply locally.

Is there a way -- after its creation -- to avoid overwriting/uploading the function via terraform?

I'm guessing this would be somewhat necessary for the CI deployment to work anyway.
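For context, the closest thing I've found is Terraform's lifecycle ignore_changes meta-argument, though I'm not sure it fully covers this case. A sketch of what I mean, applied to the resource above:

```hcl
resource "google_cloudfunctions_function" "backend_function" {
  # ... same arguments as above ...

  lifecycle {
    # After the initial create, ignore changes to the source archive so a
    # local `terraform apply` wouldn't re-upload/overwrite the deployed code.
    # Untested assumption on my part that this plays well with CI uploads.
    ignore_changes = [
      source_archive_bucket,
      source_archive_object,
    ]
  }
}
```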

I've looked at a handful of other SO threads but they were looking at specifics around cloud-build and the artifacts registry.

Upvotes: 4

Views: 1085

Answers (3)

intotecho

Reputation: 5684

Does it make sense to have a cloud function with no code?

I think it is a great request, and I am also looking to use Terraform to deploy a function and set up its IAM, memory, vpc_connector settings, region, labels, and a special service account. These are all CI concerns. But the code is built and deployed from a separate CD pipeline. Parameters like runtime, trigger type and entrypoint could be handled by either the CI or the CD, but then the issue is how to ensure CI does not override CD. Parameters like environment_variables could be merged. Perhaps, if they are not specified, they wouldn't get overridden, which would meet your other requirement?

This does not seem to be possible. I think it's a limitation of the Cloud Functions API, not Terraform. If you don't specify any source variables in the Terraform resource, the apply will fail with "GCS URI gs:/// does not match the expected pattern gs://{bucket}/{path}". If the supplied code does not meet minimum requirements, the deploy will fail and not create the function. Deploying a hello-world function from CI risks breaking the CD if it overrides the intended version.

The fallback seems to be that the CI creates the service account, VPC connector and permissions and enables the APIs but the CD pipeline is in control of creating, building and deploying the function with gcloud deploy.
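A minimal sketch of the CI-side half of that split (the resource names, account id and role here are illustrative assumptions, not from the question; the function itself would be created and deployed by the CD pipeline via gcloud):

```hcl
# CI-managed pieces: API enablement, service account, IAM binding.
resource "google_project_service" "cloudfunctions" {
  project = "owlee-software"
  service = "cloudfunctions.googleapis.com"
}

resource "google_service_account" "function_sa" {
  project      = "owlee-software"
  account_id   = "backend-function-sa" # illustrative name
  display_name = "Backend function service account"
}

resource "google_project_iam_member" "function_invoker" {
  project = "owlee-software"
  role    = "roles/cloudfunctions.invoker"
  member  = "serviceAccount:${google_service_account.function_sa.email}"
}
```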

Upvotes: 0

dogmatic69

Reputation: 7585

Instead of using a fixed name as you are, use a random string or, depending on your needs, the commit hash, for example. This can be prefixed with other things to make it even more unique.

resource "random_string" "function" {
  length           = 8
  special          = false
  
  keepers = {
    commit_hash = var.commit_hash,
    environment = var.environment,
  }
}

resource "google_storage_bucket_object" "archive" {
  # Put the random suffix in the object name so a new commit/environment
  # uploads a new object, forcing the function to pick up the new source.
  name   = "index-${random_string.function.result}.zip"
  bucket = google_storage_bucket.owlee_functions_bucket.name
  source = "../apps/backend/dist/index.zip"
}

resource "google_cloudfunctions_function" "backend_function" {
  name    = "backend_function"
  runtime = "nodejs16"
  project = "owlee-software"
  region  = "europe-west2"

  available_memory_mb   = 128
  source_archive_bucket = google_storage_bucket.owlee_functions_bucket.name
  source_archive_object = google_storage_bucket_object.archive.name
  trigger_http          = true
  entry_point           = "OWLEE"
}

This way, if you provide an environment such as prod and the same commit hash every time, it will produce the same object name.

If you provide a new environment, say "local", it will generate a new object. You can then create multiple instances of functions, or make more changes to the google_cloudfunctions_function so that it can be used with workspaces.
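The example above assumes commit_hash and environment input variables exist; a minimal declaration (my sketch, not from the answer) might look like:

```hcl
variable "commit_hash" {
  type        = string
  description = "Git commit SHA passed in from CI, e.g. -var=\"commit_hash=$(git rev-parse HEAD)\""
}

variable "environment" {
  type        = string
  description = "Deployment environment, e.g. prod or local"
  default     = "local"
}
```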

Upvotes: 0

SergiCarbajosa

Reputation: 11

I recommend deploying the cloud function with Terraform, but having the CI of the cloud function maintained by a Cloud Build trigger (also created by Terraform). I think this is the most logical solution, since Terraform manages the infrastructure, not the implementation of the cloud function.
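A sketch of that arrangement, assuming a GitHub-hosted repo (the owner/repo names and cloudbuild.yaml path are assumptions): Terraform creates the trigger, and the Cloud Build config it points at owns building and deploying the function's code.

```hcl
resource "google_cloudbuild_trigger" "function_deploy" {
  project  = "owlee-software"
  name     = "deploy-backend-function"
  filename = "cloudbuild.yaml" # build steps there run `gcloud functions deploy`

  github {
    owner = "your-org"  # assumption
    name  = "your-repo" # assumption
    push {
      branch = "^main$"
    }
  }
}
```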

Upvotes: 0
