Manube

Reputation: 5242

I cannot just deploy a single function with Serverless Framework 1.20.2

I wanted to follow these tips and just redeploy my function, as the serverless.yml had not changed.

However, it just hangs at the Serverless: Uploading function stage, apparently forever.

The whole deploy (with sls deploy) works, though slowly.

How can I debug this, as there is apparently no error message?

EDIT

When I use sls deploy my project takes about 4 min and 15s to deploy.

It seems rather long to me, so I thought I would use sls deploy function -f myFunction instead, which is supposed to be much faster.

However, when I try sls deploy function -f myFunction, it seems to just hang forever on Serverless: Uploading function: myFunction.

I have no idea how to debug that.

Using the --verbose flag (sls deploy function -f myFunction --verbose) does not seem to make a difference: the messages returned are the same.

I will try to wait and see if, eventually, the function deploy completes...

Well, I waited, and it doesn't: after about 8 min 30s I get the following error message:

    Serverless Error ---------------------------------------

      Connection timed out after 120000ms

    Get Support --------------------------------------------
      Docs:   docs.serverless.com
      Bugs:   github.com/serverless/serverless/issues
      Forums: forum.serverless.com
      Chat:   gitter.im/serverless/serverless

    Your Environment Information -----------------------------
      OS:                 linux
      Node Version:       7.10.0
      Serverless Version: 1.20.2

Another oddity: when hanging, it reads:

Serverless: Uploading function: myFunction (12.05 MB)...

But the function itself is just 3.2 kB, and does not include any packages.

When I use sls deploy, the size displayed is the same:

Serverless: Uploading service .zip file to S3 (12.05 MB)...

What could be wrong with my function deploy?

EDIT 2

As @dashmug hinted, there is a config issue in serverless.yml.

In the functions dir of my serverless project, I would like to have a common package.json and node_modules. Then each function could import modules as needed.

I tried to follow the official guide.

My serverless.yml is like so:

functions:
  myFunction:
    package:
      exclude:
        - 'functions/node_modules/**'
        - '!functions/node_modules/module1_I_want_to_include/**'
        - '!functions/node_modules/module2_I_want_to_include/**'

Now I get, with sls deploy:

Serverless: Uploading service .zip file to S3 (31.02 MB)...

and the function works :)
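As an aside, the exclude list above is resolved in order: a matching pattern excludes a file, and a matching pattern prefixed with ! re-includes it, so later rules override earlier ones. A minimal sketch of that resolution (a simplified model for illustration, not the framework's actual matcher, which supports richer glob syntax):

```javascript
// Simplified model of an ordered exclude list: every file starts included,
// a matching pattern excludes it, and a matching '!' pattern re-includes it.
// Later rules override earlier ones.

// Convert a minimal glob subset to a RegExp: '**' spans path segments,
// '*' matches within a single segment.
function globToRegExp(glob) {
  const escaped = glob.replace(/[.+^${}()|[\]\\]/g, '\\$&');
  const pattern = escaped
    .replace(/\*\*/g, '\u0000') // protect '**' before handling single '*'
    .replace(/\*/g, '[^/]*')    // '*' stays within one path segment
    .replace(/\u0000/g, '.*');  // '**' crosses segment boundaries
  return new RegExp('^' + pattern + '$');
}

function isExcluded(file, patterns) {
  let excluded = false;
  for (const p of patterns) {
    const negated = p.startsWith('!');
    if (globToRegExp(negated ? p.slice(1) : p).test(file)) {
      excluded = !negated;
    }
  }
  return excluded;
}

const patterns = [
  'functions/node_modules/**',
  '!functions/node_modules/module1_I_want_to_include/**',
];

isExcluded('functions/node_modules/lodash/index.js', patterns);               // → true
isExcluded('functions/node_modules/module1_I_want_to_include/lib.js', patterns); // → false
isExcluded('functions/handler.js', patterns);                                 // → false
```

Under this model, everything in functions/node_modules is dropped except the re-included modules, which is consistent with the larger 31.02 MB service package above.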

However, with sls deploy function -f myFunction, I get:

Serverless: Uploading function: dispatch (1.65 MB)...

It does upload in a reasonable time, but the function now gives the following error:

Unable to import module 'functions/myFunction': Error

Upvotes: 14

Views: 5070

Answers (3)

Manube

Reputation: 5242

I wasn't able to figure out why function deployment (as opposed to service deployment) would hang. I may have misconfigured my serverless.yml file.

But no big deal: I can do without sls deploy function -f myFunction.

My expectations were wrong: I thought deploying a single function would be much faster than deploying the whole service, by somehow not re-uploading the node_modules directory.

But there is no partial function deployment in AWS: when a function is deployed, all necessary node modules must be deployed as well for the function to work.

As explained in serverless doc:

The Framework packages up the targeted AWS Lambda Function into a zip file.

The Framework fetches the hash of the already uploaded function .zip file and compares it to the local .zip file hash.

The Framework terminates if both hashes are the same.

That zip file is uploaded to your S3 bucket using the same name as the previous function, which the CloudFormation stack is pointing to.

I had (naively) hoped that only the updated handler would be uploaded to S3. But as the function is packaged before deployment, it does need all of its modules and dependencies.

So the way I see it, function deployment would save time (as opposed to service deployment) only if the service has multiple functions, and the service functions do not use many common nodejs modules. And if sls deploy function -f myFunction does not hang, that is :)


So to increase development speed, the trick is to use offline emulation with a tool like serverless-offline.

serverless-offline provides a local server, so the lambda function myFunction becomes accessible locally by calling http://localhost:3000/myFunction in Postman or the browser.
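Wiring that up is just a dev dependency and a plugin entry (assuming the serverless-offline plugin; port 3000 is its default):

```yaml
# serverless.yml -- after `npm install --save-dev serverless-offline`
plugins:
  - serverless-offline

# then start the local server (defaults to http://localhost:3000):
#   sls offline start
```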

In most cases, sls deploy then needs to be called only once, after the handler has been thoroughly tested offline.

Upvotes: 2

Noel Llevares

Reputation: 16037

Things I would look at:

  1. Try comparing what happens between the two:

    $ SLS_DEBUG=true sls deploy --verbose

    and

    $ SLS_DEBUG=true sls deploy function -f myFunction --verbose

  2. Check your serverless config (packaging, etc.) against your project structure. One red flag is that the function deploy is as big as the service deploy. This could be a misconfiguration problem.

  3. Use serverless package to see how the package(s) are zipped. It can provide some clues.

  4. Are you using any plugins which may have altered the way your package is created?

  5. How many node_modules directories do you have? Do you have only one for the entire service, or one for each function?

Upvotes: 9

JustDanyul

Reputation: 14044

You can make the deploy process more verbose by passing the --verbose argument to the deploy command.

Either sls deploy --verbose or sls deploy -v will do the trick.

Upvotes: 2
