jalazbe

Reputation: 2005

databricks-dbx HTTPError 403 Client Error

I am running some jobs using dbx.

I am following the guidelines from https://dbx.readthedocs.io/en/latest/features/assets/?h=dbx+launch+assets

I run the following commands:

dbx deploy <my-workflow> --assets-only
dbx launch <my-workflow> --from-assets

I get the following error:

TypeError: submit_run() got an unexpected keyword argument 'permissions'

In my deployment.yml I have included this:

custom:
  basic-cluster-props: &basic-cluster-props
    spark_version: "10.4.x-scala2.12"
    node_type_id: "Standard_D3_v2"

  basic-settings: &basic-settings
    libraries:
      - pypi:
          package: "pyyaml"
    permissions:
      access_control_list:
        - user_name: "userid" 
          permission_level: "IS_OWNER"
        - group_name: "admins"
          permission_level: "CAN_MANAGE"
        - group_name: "rolename"
          permission_level: "CAN_MANAGE"

  basic-static-cluster: &basic-static-cluster
    new_cluster:
      <<: *basic-cluster-props
      num_workers: 1
    <<: *basic-settings

environments:
  default:
    strict_path_adjustment_policy: true
    workflows:
      - name: "current-integration-test"
        <<:
          - *basic-static-cluster
        spark_python_task:
          python_file: "file://tests/integration/cp/silver/test_myjob_job.py"
          parameters: ["--conf-file", "file:fuse://conf/int/cp.yml","--cov=dlite"]

What am I missing here?

Upvotes: 0

Views: 286

Answers (1)

renardeinside

Reputation: 395

There was a bug in dbx which was fixed in the latest 0.7.6 release. Please upgrade to the new version and try again.
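
For reference, a minimal upgrade sketch, assuming dbx was installed with pip:

pip install --upgrade "dbx>=0.7.6"
pip show dbx   # confirm the installed version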

The root cause is that the permissions section is not accepted by the RunSubmit API, which is used under the hood of the assets-based launch. We made dbx fail-safe here to improve the customer experience, but please note that the permissions settings will not be applied.
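
If you do need the access_control_list applied, one option (assuming dbx's documented permissions management is applied on a standard deploy) is a non-assets deployment, which creates a named job that the permissions can be attached to. A minimal sketch, reusing the workflow name from the question:

dbx deploy current-integration-test   # creates/updates the named job
dbx launch current-integration-test   # launches the deployed job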

Upvotes: 2
