Reputation: 71
I want to run my Python script that produces a backup folder and then push that folder to GCP Storage. The steps are: run the Python script, then push the folders to GCP. I added the workspace so that the folder could be persisted to it. The build job passes when I delete the rest, but I need the upload part as well. My config.yml:
jobs:
  build:
    executor: python/default
    steps:
      - checkout
      - run: mkdir -p workspace
      - run: echo "Hello, world!" > workspace/echo-output
      - python/install-packages:
          pkg-manager: pip
      - python/install-packages:
          pip-dependency-file: requirements.txt
          pkg-manager: pip
      - python/install-packages:
          args: pytest
          pkg-manager: pip
          pypi-cache: false
      - run:
          command: |
            pytest --version
          name: Test
      - run:
          name: Show all files
          command: find "$(pwd)"
      - persist_to_workspace:
          root: workspace
          paths:
            - echo-output
  use-gcp:
    executor: gcp-cli/default
    steps:
      - gcp-cli/setup:
          version: 330.0.0
  push-public-folder:
    executor: gcp-storage/default
    steps:
      - attach_workspace:
          at: workspace/echo-output
      - setup_directquery_cloud_credentials
      - gcp-storage/upload:
          source_path: echo-output
          destination_bucket: backup_folder
          cache_control: public, max-age=86400
workflows:
  main:
    jobs:
      - build
      - use-gcp
      - push-public-folder
fails.
What is wrong with my executor? The temp log file from VS Code shows:
1. [#/jobs/build] only 1 subschema matches out of 2
| 1. [#/jobs/build] extraneous key [push-public-folder] is not permitted
| | Permitted keys:
| | - description
| | - parallelism
| | - macos
| | - resource_class
| | - docker
| | - steps
| | - working_directory
| | - circleci_ip_ranges
| | - machine
| | - environment
| | - executor
| | - shell
| | - parameters
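From that message it sounds like the validator thinks push-public-folder is nested inside the build job, so maybe my indentation is off somewhere. My understanding is that every job has to sit at the same indent level directly under jobs:, roughly like this stripped-down sketch (orbs omitted, step contents just placeholders, not my real config):

version: 2.1
# orbs: ... (same orbs as above, omitted here)
jobs:
  build:                  # first job, two spaces under jobs:
    executor: python/default
    steps:
      - checkout
  use-gcp:                # sibling of build, same indent level
    executor: gcp-cli/default
    steps:
      - gcp-cli/setup
  push-public-folder:     # also a sibling of build, not nested inside it
    executor: gcp-storage/default
    steps:
      - run: echo "upload would go here"
workflows:
  main:
    jobs:
      - build
      - use-gcp
      - push-public-folder

Is that the structure CircleCI expects, or is the problem somewhere else in my executors?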
Upvotes: 0
Views: 24