Ne1zvestnyj

Reputation: 1397

How do I run a python script that is located on the AWS EC2 server?

I want to use Python code on my computer to run a Python script located on the server (EC2, Ubuntu 18). I understand that boto can be used for this, but I didn't find a complete example that says: here is the server, connect to it like this, execute the script like this.

Upvotes: -1

Views: 5143

Answers (3)

surplusPolyCount

Reputation: 1

Currently working through this problem, and the answer above is correct.

One important thing to note is that when Lambda uses SSM, it connects as a different user. So any modules installed using pip from the Session Manager console might not be accessible from Lambda's call if it runs as a different user.
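A quick way to confirm this user mismatch is to have the SSM-invoked script print which user and interpreter it actually runs under; a sketch you can drop into the script being debugged:

```python
import getpass
import sys

# Print the effective user and the Python interpreter in use.
# pip packages installed for a different user (or into a different
# interpreter's site-packages) will not be importable here.
print("user:", getpass.getuser())
print("interpreter:", sys.executable)
```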

One thing that really helped me debug was defining an S3 bucket where stderr and stdout can go. Within the send_command call I set the OutputS3BucketName parameter: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/ssm/client/send_command.html

Upvotes: 0

Pranav

Reputation: 151

You can do so by using AWS SSM or a Lambda function.

Refer to @mokugo-devops's answer for AWS SSM,

or refer to this for the Lambda function approach:

# requires the paramiko package
# a paramiko Lambda deployment package is available at:
# https://github.com/pranavmalaviya2/COVID-19-Live-Data-board/tree/master/lambda%20functions/SSH_lambda-Deployment-package

import json
import boto3
import paramiko

def lambda_handler(event, context):
    # boto3 client
    client = boto3.client('ec2')
    s3_client = boto3.client('s3')
    
    # getting instance information (e.g. to look up the instance's public IP)
    describeInstance = client.describe_instances()
    
    # downloading pem file from S3 (Lambda can only write to /tmp)
    s3_client.download_file('bucket-name', 'key-name.pem', '/tmp/new-key-name.pem')

    # reading pem file and creating key object
    key = paramiko.RSAKey.from_private_key_file("/tmp/new-key-name.pem")
    # an instance of the Paramiko.SSHClient
    ssh_client = paramiko.SSHClient()
    # setting policy to connect to unknown host
    ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())

    # connect to the instance (use its public IP or DNS name)
    ssh_client.connect(hostname="12.12.12.12", username="ubuntu", pkey=key)

    # command list
    commands = [
        "python script.py",
        "python script2.py",
        "aws s3 cp --recursive source/ s3://destination-bucket/",
    ]

    # executing list of commands within server
    print("Starting execution")
    for command in commands:
        print("Executing command: " + command)
        stdin, stdout, stderr = ssh_client.exec_command(command)
        print(stdout.read().decode())
        print(stderr.read().decode())
    
    ssh_client.close()
    print("finished execution")
    
    return {
        'statusCode': 200,
        'body': json.dumps('Execution Completed')
    }

Upvotes: 4

Chris Williams
Chris Williams

Reputation: 35146

Take a look at AWS SSM - Run Command.

From your local Python script you can call send_command.

To execute this, you will need to ensure that the target instance has the SSM Agent installed and that the instance has a role with the correct privileges. Example command:

import boto3
client = boto3.client('ssm')

client.send_command(
    InstanceIds=[
        'i-01234567',
    ],
    DocumentName='AWS-RunShellScript',
    Parameters={
        'commands': [
            'python3 /home/ec2-user/main.py',
        ]
    }
)

Upvotes: 2
