Gary

Reputation: 81

How to upload a local file to an EC2 instance with the module terraform-aws-modules/ec2-instance/aws?

How do I upload a local file to an EC2 instance with the module terraform-aws-modules/ec2-instance/aws? I placed a provisioner inside module "ec2", and it does not work. I also placed the provisioner outside of module "ec2", and that does not work either. I get the error: "Blocks of type "provisioner" are not expected here".

"provisioner" is inside module "ec2". It does not work.

module "ec2" {
  source = "terraform-aws-modules/ec2-instance/aws"
  version = "4.1.4"
  name = var.ec2_name
  ami  = var.ami
  instance_type = var.instance_type
  availability_zone = var.availability_zone
  subnet_id = data.terraform_remote_state.vpc.outputs.public_subnets[0]
  vpc_security_group_ids = [aws_security_group.sg_WebServerSG.id]
  associate_public_ip_address = true
  key_name = var.key_name
  provisioner "file" {
    source      = "./foo.txt"
    destination = "/home/ec2-user/foo.txt"
    connection {
      type        = "ssh"
      user        = "ec2-user"
      private_key = "${file("./keys.pem")}"
      host        = module.ec2.public_dns
    }
  }
}

"provisioner" is outsite of the module "ec2". It does not work.

module "ec2" {
  source = "terraform-aws-modules/ec2-instance/aws"
  version = "4.1.4"
  name = var.ec2_name
  ami  = var.ami
  instance_type = var.instance_type
  availability_zone = var.availability_zone
  subnet_id = data.terraform_remote_state.vpc.outputs.public_subnets[0]
  vpc_security_group_ids = [aws_security_group.sg_WebServerSG.id]
  associate_public_ip_address = true
  key_name = var.key_name
}

provisioner "file" {
  source      = "./foo.txt"
  destination = "/home/ec2-user/foo.txt"
  connection {
    type        = "ssh"
    user        = "ec2-user"
    private_key = "${file("./keys.pem")}"
    host        = module.ec2.public_dns
  }
}

Upvotes: 0

Views: 1324

Answers (2)

Mark B

Reputation: 200657

You can provision files on an EC2 instance with the YAML cloud-init syntax, which is passed to the EC2 instance as user data. Here is an example of passing a cloud-init config to EC2.

cloud-init.yaml file:

#cloud-config
# vim: syntax=yaml
#
# This is the configuration syntax that the write_files module
# will know how to understand. Encoding can be given b64 or gzip or (gz+b64).
# The content will be decoded accordingly and then written to the path that is
# provided. 
#
# Note: Content strings here are truncated for example purposes.
write_files:
- content: |
    # Your TXT file content...
    # goes here
  path: /home/ec2-user/foo.txt
  owner: ec2-user:ec2-user
  permissions: '0644'

Terraform file:

module "ec2" {
  source = "terraform-aws-modules/ec2-instance/aws"
  version = "4.1.4"
  name = var.ec2_name
  ami  = var.ami
  instance_type = var.instance_type
  availability_zone = var.availability_zone
  subnet_id = data.terraform_remote_state.vpc.outputs.public_subnets[0]
  vpc_security_group_ids = [aws_security_group.sg_WebServerSG.id]
  associate_public_ip_address = true
  key_name = var.key_name
  user_data = file("./cloud-init.yaml")
}
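If you would rather keep foo.txt as a separate local file instead of inlining its content in the YAML, one option is to render the cloud-init config with templatefile(). This is a hypothetical sketch: the template name cloud-init.yaml.tpl and the variable file_content are assumptions, not part of the original answer.

```hcl
# Hypothetical: cloud-init.yaml.tpl is a copy of cloud-init.yaml where the
# write_files content is a template placeholder, e.g.:
#
#   write_files:
#   - content: |
#       ${indent(4, file_content)}
#     path: /home/ec2-user/foo.txt
#     owner: ec2-user:ec2-user
#     permissions: '0644'
#
# Then the module passes the rendered template as user data, injecting the
# local file's content at plan time:
module "ec2" {
  source  = "terraform-aws-modules/ec2-instance/aws"
  version = "4.1.4"

  # ... other arguments as above ...

  user_data = templatefile("./cloud-init.yaml.tpl", {
    file_content = file("./foo.txt")
  })
}
```

This keeps the file's source of truth on disk while still delivering it at instance creation, with no SSH connectivity required.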

The benefits of this approach over the approach in the accepted answer are:

  • This method creates the file immediately at instance creation, instead of having to wait for the instance to come up first. The null-provisioner/SSH connection method has to wait for the EC2 instance to become available, and the timing of that could cause your Terraform workflow to become flaky.

  • This method doesn't require the EC2 instance to be reachable from your local computer that is running Terraform. You could be deploying the EC2 instance to a private subnet behind a load balancer, which would prevent the null-provisioner/SSH connect method from working.

  • This doesn't require you to have the SSH key for the EC2 instance available on your local computer. You might want to only allow AWS SSM connect to your EC2 instance, to keep it more secure than allowing SSH directly from the Internet, and that would prevent the null-provisioner/SSH connect method from working. Further, storing or referencing an SSH private key in your Terraform state adds a risk factor to your overall security profile.

  • This doesn't require the use of a null_resource provisioner, about which the Terraform documentation states:

    Important: Use provisioners as a last resort. There are better alternatives for most situations. Refer to Declaring Provisioners for more details.

Upvotes: 1

Hamza AZIZ

Reputation: 2937

You can use a null resource to make it work! A provisioner block is only valid inside a resource block, not inside a module block, which is why you got the "Blocks of type "provisioner" are not expected here" error.

resource "null_resource" "this" {
 provisioner "file" {
source      = "./foo.txt"
destination = "/home/ec2-user/foo.txt"
connection {
  type        = "ssh"
  user        = "ec2-user"
  private_key = "${file("./keys.pem")}"
  host        = module.ec2.public_dns
}

}
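One caveat: a null_resource runs its provisioner only once, when it is first created. If you want the upload to re-run whenever foo.txt changes, you could key the resource on the file's hash via triggers. This is a hedged sketch, not part of the original answer; the resource name upload_foo and the triggers key are illustrative.

```hcl
# Hypothetical variant: the triggers map forces the null_resource to be
# replaced (and the file re-uploaded) whenever foo.txt's content changes.
resource "null_resource" "upload_foo" {
  triggers = {
    file_hash = filemd5("./foo.txt")
  }

  provisioner "file" {
    source      = "./foo.txt"
    destination = "/home/ec2-user/foo.txt"

    connection {
      type        = "ssh"
      user        = "ec2-user"
      private_key = file("./keys.pem")
      host        = module.ec2.public_dns
    }
  }
}
```

Note that this still requires SSH reachability from the machine running Terraform, as the other answer points out.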

Upvotes: 2
