Max

Reputation: 224

Terraform EC2 instance is not available for SSH after successful TF apply (unless SSH is configured manually)

I built this code for example purposes, but however I configure it, the EC2 instance never becomes reachable over SSH using the connection settings the EC2 console provides, even though the console says it should be able to connect. I'm using Terraform v0.13.1.

This is my TF code:

resource "aws_instance" "live" {
  ami = "ami-060e472760062f83f"
  instance_type = "t2.nano"
  key_name = "xxx"

  subnet_id = aws_subnet.default.id
  vpc_security_group_ids =  [
    aws_security_group.http-group.id,
    aws_security_group.https-group.id,
    aws_security_group.ssh-group.id,
    aws_security_group.all-outbound-traffic.id,
  ]

  depends_on = [
    aws_internet_gateway.gw,
    aws_network_interface.default
  ]
}

Networking setup:

resource "aws_security_group" "https-group" {
  name        = "https-access-group"
  description = "Allow traffic on port 443 (HTTPS)"

  tags = {
    Name = "HTTPS Inbound Traffic Security Group"
  }

  ingress {
    from_port = 443
    to_port   = 443
    protocol  = "tcp"
    cidr_blocks = ["0.0.0.0/0"]

  }
}


resource "aws_security_group" "http-group" {
  name        = "http-access-group"
  description = "Allow traffic on port 80 (HTTP)"

  tags = {
    Name = "HTTP Inbound Traffic Security Group"
  }

  ingress {
    from_port = 80
    to_port   = 80
    protocol  = "tcp"
    cidr_blocks = ["0.0.0.0/0"]

  }
}

resource "aws_security_group" "all-outbound-traffic" {
  name        = "all-outbound-traffic-group"
  description = "Allow traffic to leave the AWS instance"

  tags = {
    Name = "Outbound Traffic Security Group"
  }

  egress {
    from_port = 0
    to_port   = 0
    protocol  = "-1"
    cidr_blocks = ["0.0.0.0/0"]
  }
}

resource "aws_security_group" "ssh-group" {
  name        = "ssh-access-group"
  description = "Allow traffic to port 22 (SSH)"

  tags = {
    Name = "SSH Access Security Group"
  }

  ingress {
    description = "SSH to VPC"
    from_port   = 22
    to_port     = 22
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  egress {
    from_port = 0
    to_port = 0
    protocol = "-1"
    cidr_blocks = ["0.0.0.0/0"]
  }
}


resource "aws_vpc" "default" {
  cidr_block = "10.0.0.0/16"
  enable_dns_support   = true
}

resource "aws_subnet" "default" {
  vpc_id = aws_vpc.default.id
  cidr_block = "10.0.10.0/24"
}

resource "aws_internet_gateway" "gw" {
  vpc_id = aws_vpc.default.id
}


resource "aws_eip_association" "eip_assoc" {
  depends_on = [
    aws_instance.live,
    aws_eip.lb
  ]
  instance_id = aws_instance.live.id
  allocation_id = aws_eip.lb.id
  #network_interface_id = aws_network_interface.multi-ip.id
}

resource "aws_eip" "lb" {
  vpc = true
  instance = aws_instance.live.id

  depends_on = [
    aws_instance.live
  ]
}

resource "aws_network_interface" "default" {
  subnet_id       = aws_subnet.default.id
  security_groups = [
    aws_security_group.all-outbound-traffic.id,
    aws_security_group.http-group.id,
    aws_security_group.https-group.id,
    aws_security_group.ssh-group.id
  ]
}

And Route53 Setup:

    resource "aws_route53_zone" "default" {
      name = "xxx.io."

      vpc {
        vpc_id = aws_vpc.default.id
      }
    }

    resource "aws_route53_record" "x-www" {
      zone_id = aws_route53_zone.default.zone_id
      name = "www.xxx.io"
      type = "A"
      ttl = "300"
      records = [aws_eip.lb.public_ip]
    }

    resource "aws_route53_record" "x-sub-www" {
      zone_id = aws_route53_zone.default.zone_id
      name = "*.xxx.io"
      type = "A"
      ttl = "300"
      records = [aws_eip.lb.public_ip]
    }

In the AWS console everything seems wired together correctly, but neither the local-exec provisioner in my Terraform config nor I can SSH into the EC2 instance. What am I doing wrong?

The instance still remains unreachable via SSH. The instances are being built correctly, though, and in the EC2 console everything appears to be set up correctly:

[screenshot: EC2 instance details]

Security Groups are set correctly to EC2 Instance:

[screenshot: security groups attached to the EC2 instance]

Upvotes: 0

Views: 351

Answers (1)

Saurabh Nigam

Reputation: 823

You have created the security group correctly, but it is not attached to your EC2 instance, so port 22 is never opened on that instance.

Use this:

resource "aws_network_interface_sg_attachment" "sg_attachment" {
  security_group_id    = aws_security_group.ssh-group.id
  network_interface_id = aws_instance.live.primary_network_interface_id
}

Attach the other security groups the same way.
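The remaining attachments can be written as a single resource with `for_each` instead of repeating the block once per group. This is a minimal sketch assuming the security group names from the question; `for_each` on resources requires Terraform 0.12.6 or later, so it works on the 0.13.1 version mentioned above:

```hcl
# Attach each remaining security group to the instance's primary ENI.
# The map keys are static labels; the values reference the question's groups.
resource "aws_network_interface_sg_attachment" "sg_attachments" {
  for_each = {
    http     = aws_security_group.http-group.id
    https    = aws_security_group.https-group.id
    outbound = aws_security_group.all-outbound-traffic.id
  }

  security_group_id    = each.value
  network_interface_id = aws_instance.live.primary_network_interface_id
}
```

Note that `for_each` map keys must be known at plan time, which is why the keys here are plain strings while the values may be computed IDs.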

Upvotes: 3
