Reputation: 1651
It would be cool if I could run a workflow that looks like the following. Maybe I'm just missing a simple configuration option in GitHub Actions, but I don't know how to share a workspace between jobs while using `needs`
to specify which jobs can run only after others have completed successfully.
name: Node CI
on: [push]
env:
  CI: true
jobs:
  install:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        node-version: [12.x]
    steps:
      - uses: actions/checkout@v1
      - name: Use Node.js ${{ matrix.node-version }}
        uses: actions/setup-node@v1
        with:
          node-version: ${{ matrix.node-version }}
      - name: install node_modules
        run: yarn install
  lint:
    runs-on: ubuntu-latest
    needs: [install]
    steps:
      - name: eslint
        run: yarn lint
  build:
    needs: [install]
    runs-on: ubuntu-latest
    steps:
      - name: yarn build
        run: yarn build
  test:
    needs: [install, build]
    runs-on: ubuntu-latest
    steps:
      - name: jest
        run: yarn test --coverage
I have read "GitHub actions share workspace/artifacts between jobs?" but I'd rather not have to upload node_modules
and download it again for every job.
Upvotes: 5
Views: 2006
Reputation: 42220
As far as I know, the Actions workspace is only shared between steps of the same job; each job runs in a fresh environment, so you cannot share the filesystem between jobs.
Uploading/downloading artifacts between jobs is one solution. You could also try the new actions/cache action to cache the node_modules directory and restore it in subsequent jobs:
- uses: actions/cache@v1
  with:
    path: node_modules
    key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
    restore-keys: |
      ${{ runner.os }}-node-
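To sketch how this fits the workflow in the question: each dependent job repeats the same cache step (with the same key) before its run commands, so a cache populated by the install job can be restored by lint, build, and test. Job and step names below are taken from the question; hashing yarn.lock instead of package-lock.json is my assumption, since the project uses yarn.

```yaml
# Sketch only: the lint job restoring the cache written by install.
# The same cache step would be repeated in the build and test jobs.
lint:
  runs-on: ubuntu-latest
  needs: [install]
  steps:
    - uses: actions/checkout@v1
    - uses: actions/cache@v1
      with:
        path: node_modules
        # yarn.lock assumed here because the workflow uses yarn
        key: ${{ runner.os }}-node-${{ hashFiles('**/yarn.lock') }}
        restore-keys: |
          ${{ runner.os }}-node-
    - name: eslint
      run: yarn lint
```

On a cache hit the yarn install step in the dependent jobs becomes unnecessary (or a fast no-op), which is the point of caching over artifacts here.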
Note that there are currently some fairly strict limits, so it may not work if you have a very large node_modules directory.
Individual caches are limited to 400MB and a repository can have up to 2GB of caches. Once the 2GB limit is reached, older caches will be evicted based on when the cache was last accessed. Caches that are not accessed within the last week will also be evicted.
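For comparison, the artifact approach mentioned above would look roughly like this; it works but transfers the whole node_modules archive on every run, which is what the question was hoping to avoid. This is a sketch using actions/upload-artifact and actions/download-artifact; the artifact name is illustrative.

```yaml
# In the install job, after yarn install:
- uses: actions/upload-artifact@v1
  with:
    name: node_modules
    path: node_modules

# In each dependent job (lint, build, test), before its run step:
- uses: actions/download-artifact@v1
  with:
    name: node_modules
    path: node_modules
```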
Upvotes: 2