Reputation: 219
I was wondering if someone could point me in the right direction. I am looking at turning our Kubernetes version into a variable within our Terraform module, as currently it's hard-coded.
Every now and then we get an error saying that our "orchestration version" isn't valid when we deploy our cluster - this is due to Azure dropping our Kubernetes version... (We deploy our dev/uat envs daily)
So I came up with the idea of using a variable for our version so we can always deploy, avoiding the "invalid orchestration" error. I wrote a bash script that gets what I need:
az aks get-versions --location westeurope --query 'orchestrators' -o tsv | awk '{print $3}' | tail -2 | head -n 1
But now I want to use the output of the script above as the Kubernetes version in the Terraform module we deploy daily.
Can anyone point me in the right direction?
I've had a look at using build-args within the docker container.
Upvotes: 13
Views: 19217
Reputation: 3846
Use the external data source:
https://registry.terraform.io/providers/hashicorp/external/latest/docs/data-sources/external
The only trouble is that the (optional) input and the output must be valid JSON, with some limitations.
Example:
data "external" "myjson" {
program = [
"${path.module}/scriptWhichReturnsJson.sh",
]
result {
}
}
...
resource ... {
value = data.external.myjson.property
}
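Applied to the question, the script only needs to wrap the az command so it prints a single JSON object. A rough, untested sketch (the script name and the version key are my own choices):

#!/usr/bin/env bash
# aks-version.sh (hypothetical name) - prints e.g. {"version":"1.26.3"} for the external data source
set -euo pipefail
version=$(az aks get-versions --location westeurope --query 'orchestrators' -o tsv \
  | awk '{print $3}' | tail -2 | head -n 1)
jq --null-input --arg version "$version" '{"version": $version}'

and on the Terraform side:

data "external" "aks_version" {
  program = ["${path.module}/aks-version.sh"]
}

# then reference data.external.aks_version.result.version,
# e.g. as the kubernetes_version argument of your cluster resource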
Upvotes: 13
Reputation: 1928
You can use HashiCorp's External provider to do this, but your program can only read JSON from stdin (or nothing at all), and must write its output as JSON.
Here's my example using External 2.3.1 (the latest version as of October 2023):
data.tf:
data "external" "gcloud_auth_token" {
program = [
"sh", "${path.module}/gcloud-token.sh",
]
query = {
# You can pass something to STDIN of your program here,
# but as per current version, the input will be given as JSON (map of string)
}
}
gcloud-token.sh:
#!/bin/sh
# AUTH_TOKEN=$(gcloud auth print-access-token)
AUTH_TOKEN="YourTokenGoesHere"
OUTPUT=$(jq --null-input --arg token "$AUTH_TOKEN" '{"token": $token}')
echo "$OUTPUT"
# This will give output:
# {
# "token": "YourTokenGoesHere"
# }
output.tf:
output "token" {
value = data.external.gcloud_auth_token.result.token
}
Then run:
chmod a+x gcloud-token.sh
terraform init
terraform apply
You will see this output after running terraform apply:
Apply complete! Resources: 0 added, 0 changed, 0 destroyed.
Outputs:
token = YourTokenGoesHere
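If you do pass something via query, the script receives it on stdin as a JSON object and can unpack it with jq. A rough sketch of that pattern (the location key is my own illustration, not part of the example above):

query = {
  location = "westeurope"
}

and at the top of the script:

# read the query JSON from stdin into a shell variable
eval "$(jq -r '@sh "LOCATION=\(.location)"')"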
Note: You may need to install jq on your Mac / Linux / Windows WSL to run this example yourself. Just run brew install jq (macOS) or sudo apt-get install jq (Linux, WSL. See this).
Upvotes: 1
Reputation: 114
You can easily pass the bash script output to a Terraform module through the Terraform CLI:
kubernetes_version=$([your-bash-command])
terraform [action] -var "tf_kubernetes_version=$kubernetes_version" [some-other-options] [tf-root-directory]
The action can be plan/destroy/apply. Visit this for further details.
variable "tf_kubernetes_version" {}
module "kubernetes_module" {
source = "[tf-module-directory]"
md_kubernetes = "${var.tf_kubernetes_version}"
}
variable "md_kubernetes_version" {}
Finally, you can use var.md_kubernetes_version within your Terraform module.
Note: You may opt to add a default value to both variable declarations as a baseline Kubernetes version.
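Putting it together with the az command from the question (an untested sketch; directories and extra options are yours to fill in):

kubernetes_version=$(az aks get-versions --location westeurope --query 'orchestrators' -o tsv \
  | awk '{print $3}' | tail -2 | head -n 1)
terraform apply -var "tf_kubernetes_version=$kubernetes_version"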
Hope this helps!
Upvotes: 0
Reputation: 2030
You can use Command Substitution to assign the result of a command to a variable. I took the liberty of cleaning up your text processing into a single awk statement. The following Bash should assign the second-to-last value of the third column of your az output to the variable kubernetes_version.
kubernetes_version=$(
  az aks get-versions -o tsv \
    --location westeurope --query orchestrators \
    | awk '{penult=ult; ult=$3} END{printf penult}'
)
This is untested as I don't use Azure; please comment any questions/concerns.
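To sanity-check the awk part without an Azure subscription, you can feed it some made-up TSV:

printf 'a b 1.25.6\nc d 1.26.3\ne f 1.27.1\n' \
  | awk '{penult=ult; ult=$3} END{printf penult}'
# prints 1.26.3 (the second-to-last value in column 3)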
Upvotes: -1
Reputation: 5003
One way to solve it is to store the output of the bash script in a file, and use the local provider to read that file:
data "local_file" "foo" {
filename = "${path.module}/foo.bar"
}
and then you can use the file content via ${local_file.foo.content}
.
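For the question's use case that could look roughly like this (the file name and the trimspace call are my own additions, untested). First write the version to a file before running Terraform:

az aks get-versions --location westeurope --query 'orchestrators' -o tsv \
  | awk '{print $3}' | tail -2 | head -n 1 > aks_version.txt

then read it:

data "local_file" "aks_version" {
  filename = "${path.module}/aks_version.txt"
}

locals {
  # trimspace strips the trailing newline left by the shell redirect
  kubernetes_version = trimspace(data.local_file.aks_version.content)
}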
But I would check what data sources are available in the azurerm provider. You could use the azurerm_kubernetes_cluster data source to get the kubernetes_version of the currently deployed cluster:
data "azurerm_kubernetes_cluster" "test" {
name = "myakscluster"
resource_group_name = "my-example-resource-group"
}
This exposes the Kubernetes version via the kubernetes_version attribute.
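For example, assuming the data source above, you could surface it as an output:

output "current_kubernetes_version" {
  value = data.azurerm_kubernetes_cluster.test.kubernetes_version
}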
Upvotes: 0