Reputation: 9024
I am running Serverless Offline and have configured environment variables based on stages.
provider:
  name: aws
  stage: ${opt:stage, 'dev'}
  environment:
    MY_ENV_VAR: ${self:custom.myEnvVar.${self:provider.stage}}

custom:
  stages:
    - dev
    - prod
  myEnvVar:
    dev: ${env:MY_ENV_VAR}
    prod: ${ssm:MY_ENV_VAR}
When I run serverless offline start, it throws a warning:
A valid SSM parameter to satisfy the declaration 'ssm:MY_ENV_VAR' could not be found.
By default the stage is dev, so why is it trying to access the SSM parameter store?
Any help will be appreciated
Thanks
Upvotes: 2
Views: 7021
Reputation: 787
I'm having the same issue using the latest version of serverless (2.70.0).
I also have SSM parameters referenced as per-stage custom variables.
Serverless tries to fetch values even for variables it does not need for the current stage, and that fails because it does not have permission to fetch them.
For example, with the old variable resolution it only throws a warning (still wrong, but at least it runs):
sls invoke local -f app -s local -p test/events/test.json
#variablesResolutionMode: 20210326
Serverless Warning --------------------------------------
A valid SSM parameter to satisfy the declaration 'ssm:production-variable-here' could not be found.
And with the new variable resolution ...
variablesResolutionMode: 20210326
... you get an error:
Cannot resolve variable at "custom.production-variable-here": User: arn:aws:iam::etc is not authorized to perform: ssm:GetParameter on resource: arn:aws:ssm:etc because no identity-based policy allows the ssm:GetParameter action
The error is correct in that the user is not permitted to get the value, but there is no need to get the value at all, since the SSM variable is not used for the current stage.
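A possible workaround, sketched here rather than taken from the original answer (the config/ directory and file names are hypothetical), is to move the per-stage values into separate files, so that the SSM reference only exists in the file that is actually loaded for the current stage:

provider:
  name: aws
  stage: ${opt:stage, 'dev'}
  environment:
    # Only the file for the active stage is loaded, so the SSM reference
    # in config/prod.yml is never resolved during dev or local runs.
    MY_ENV_VAR: ${file(./config/${self:provider.stage}.yml):myEnvVar}

# config/dev.yml would contain:
#   myEnvVar: some-local-value
# config/prod.yml would contain:
#   myEnvVar: ${ssm:production-variable-here}

With this layout a dev or local invocation never touches SSM, so neither the warning nor the permission error should appear.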
Upvotes: 2
Reputation: 1897
Looking at your YAML file, the first concern is the indentation of the myEnvVar variable under custom.
When I tried sls print I got the same error: the problem is that sls tries to validate/read the variable from SSM, so if the parameter doesn't exist the build fails no matter which stage it is executed for.
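One way to keep resolution from failing when the parameter does not exist, sketched here rather than taken from the original answer (assuming the framework's standard fallback syntax for variables), is to give the SSM reference a default value:

custom:
  myEnvVar:
    dev: ${env:MY_ENV_VAR}
    # The value after the comma is used as a fallback when the SSM parameter
    # cannot be found, so a dev/local build no longer fails on the prod entry.
    prod: ${ssm:MY_ENV_VAR, ''}

Note that a fallback only covers the "parameter not found" case; if the credentials are not allowed to call ssm:GetParameter at all, the lookup may still fail with a permission error.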
The following configuration works properly for me:
provider:
  name: aws
  runtime: python3.8
  environment:
    MY_ENV_VAR: ${self:custom.myEnvVar.${self:provider.stage}}
  region: eu-west-1

custom:
  myEnvVar:
    dev: 'some value'
    prod: ${ssm:some-custom-config}
But take into consideration:
Upvotes: 0