Reputation: 687
I have recently started working on a project that uses the Serverless Framework. We use Docker to make the dev environment easier to set up.
As part of this Docker setup we have a script that creates S3 buckets and tables, among other things. We were previously defining environment variables in the docker-compose file and accessing them in our Node.js app. For deployment to other environments, our DevOps team defined a few environment variables in the serverless.yaml file, which left environment configs in two places. We are now planning to move all the environment configs from the docker-compose file into serverless.yaml. This works well for our Lambda functions, which can read these configs, but it doesn't work for the standalone setup script we have written.
I tried using this plugin (serverless-scriptable-plugin) in an attempt to read these env variables, but I am still unable to do so.
Here is my serverless.yaml file:
service:
  name: my-service

frameworkVersion: '2'
configValidationMode: error

provider:
  name: aws
  runtime: nodejs14.x
  region: 'us-east-1'
  profile: ${env:PROFILE, 'dev'}
  stackName: stack-${self:provider.profile}
  apiName: ${self:custom.environment_prefix}-${self:service.name}-my-api
  environment: ${self:custom.environment_variables.${self:provider.profile}}

plugins:
  - serverless-webpack
  - serverless-scriptable-plugin
  - serverless-offline-sqs
  - serverless-offline

functions:
  myMethod:
    handler: handler.myHandler
    name: ${self:custom.environment_prefix}-${self:service.name}-myHandler
    events:
      - sqs:
          arn:
            Fn::GetAtt:
              - MyQueue
              - Arn

resources:
  Resources:
    MyQueue:
      Type: AWS::SQS::Queue
      Properties:
        QueueName: ${self:custom.queuename}
        Tags:
          - Key: product
            Value: common
          - Key: service
            Value: common
          - Key: function
            Value: ${self:service.name}
          - Key: region
            Value: ${env:REGION}

package:
  individually: true

custom:
  webpack:
    webpackConfig: ./webpack.config.js
    includeModules: true
  serverless-offline:
    host: 0.0.0.0
    port: 3000
  serverless-offline-sqs:
    apiVersion: '2012-11-05'
    endpoint: http://sqs:9324
    region: ${self:provider.region}
    accessKeyId: root
    secretAccessKey: root
    skipCacheInvalidation: false
  localstack:
    stages:
      - local
    lambda:
      mountCode: true
    debug: true
  environment_prefixes:
    staging: staging
    production: production
    dev: dev
  environment_prefix: ${self:custom.environment_prefixes.${self:provider.profile}}
  queuename: 'myQueue'
  environment_variables:
    dev:
      AWS_ACCESS_KEY_ID: test
      AWS_SECRET_ACCESS_KEY: test
      BUCKET_NAME: my-bucket
      S3_URL: http://localstack:4566
      SLS_DEBUG: '*'
  scriptable:
    commands:
      setup: node /app/dockerEntrypoint.js
In my Dockerfile I execute the script with an sls setup CMD. I initially thought that running it through the sls command might expose the environment variables defined in the serverless.yaml file, but that doesn't seem to happen.
Is there any other way this can be achieved? I am trying to access these variables via process.env, which works for the Lambdas but not for my standalone script. Thanks!
Upvotes: 1
Views: 1315
Reputation: 3777
There's not a good way to get access to these environment variables if you're running the Lambda code as a standalone script. The Serverless Framework injects these variables into the Lambda function's runtime configuration via CloudFormation. It does not insert/update the raw serverless.yml file, nor does it intercept calls to process.env in the Node process.
You could use the scriptable plugin to run a hook after packaging and then export each variable into your local Docker environment, but that seems pretty heavy for the variables in your env.
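If you did want to go that route, a rough sketch of such a hook script might look like the following. The exportEnv.js name and the output path are made up for this example, and it relies on serverless-scriptable-plugin exposing the serverless object to the scripts it runs:

// exportEnv.js — hypothetical hook script run by serverless-scriptable-plugin,
// which makes a `serverless` object available to the scripts it executes.
const fs = require('fs');

// The resolved provider.environment block from serverless.yml
const env = serverless.service.provider.environment || {};

// Write KEY=value lines that the container can source or load with dotenv
const lines = Object.entries(env)
  .map(([key, value]) => `${key}=${value}`)
  .join('\n');

fs.writeFileSync('/app/.env.generated', lines + '\n');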
Instead, you might consider something like dotenv, which loads variables from a .env file into your environment. There is a serverless-dotenv plugin you could use, and then your script could also call dotenv before running.
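For example (a minimal sketch, assuming the standard dotenv package and a .env file in the script's working directory), the setup script could load the variables itself before doing anything else:

// dockerEntrypoint.js — load variables from a .env file before the setup logic runs
require('dotenv').config(); // reads ./.env by default

// After this, the same process.env lookups the Lambda code uses resolve here too
const bucketName = process.env.BUCKET_NAME;
const s3Url = process.env.S3_URL;

console.log(`Creating bucket ${bucketName} against ${s3Url}`);

With the serverless-dotenv plugin loading the same .env file into the Serverless environment at deploy time, both the Lambdas and the standalone script can read from a single source.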
Upvotes: 1