Reputation: 1931
I have a Python application which should run remotely via an AWS pipeline and use secrets to get parameters such as database credentials. When running the application locally, the parameters are loaded from a parameters.json
file. My problem is how to detect whether I am running remotely (i.e., what to replace IN_CLOUD_TEST
with):
import boto3
from json import load
if [IN_CLOUD_TEST]:
    params_raw = boto3.client('ssm').get_parameters_by_path(Path='/', Recursive=True)['Parameters']
    params = format_params(params_raw)
else:
    with open('parameters.json') as json_file:
        params = load(json_file)
I could of course use a try/except, but there must be something nicer.
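The try/except fallback I'm referring to would look roughly like this — a minimal sketch, where `format_params` stands in for my own helper (shown here with a placeholder body) and any failure to reach SSM drops back to the local file:

```python
from json import load


def format_params(params_raw):
    # Placeholder for my real helper: flatten SSM's response into a dict.
    return {p['Name']: p['Value'] for p in params_raw}


def load_params():
    # Try SSM first; fall back to the local file when boto3 is missing or
    # cannot reach AWS (no credentials, no region, no network, ...).
    try:
        import boto3
        params_raw = boto3.client('ssm').get_parameters_by_path(
            Path='/', Recursive=True)['Parameters']
        return format_params(params_raw)
    except Exception:
        with open('parameters.json') as json_file:
            return load(json_file)
```

It works, but swallowing a broad `Exception` hides real configuration errors, which is why I'm looking for something nicer.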
Upvotes: 3
Views: 792
Reputation: 558
You could check using AWS APIs, but a simpler alternative (one that doesn't require making HTTP calls, so it also shaves off some latency) is to set an environment variable on your remote server that marks it as the production server, and read that variable from your code.
import boto3
from json import load
from os import getenv
if getenv('IS_REMOTE', False):
    params_raw = boto3.client('ssm').get_parameters_by_path(Path='/', Recursive=True)['Parameters']
    params = format_params(params_raw)
else:
    with open('parameters.json') as json_file:
        params = load(json_file)
You could also invert the logic: define a variable that is true when the machine is the testing one, and set it only on your local testing machine.
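The inverted convention might look like this — a minimal sketch, where `IS_LOCAL_TEST` is a hypothetical variable name you export only on your development machine, never in AWS:

```python
import os


def running_locally():
    # IS_LOCAL_TEST is a hypothetical flag: export it (e.g. IS_LOCAL_TEST=1)
    # only on your local machine, so its mere presence means "not in AWS".
    return os.getenv('IS_LOCAL_TEST') is not None
```

The advantage is that your production environment needs no extra configuration; forgetting to set the flag in the cloud simply leaves you on the SSM code path.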
Upvotes: 3