Downforu

Reputation: 349

Kedro run pointing to a previously used Azure Data Lake

I'm trying to read from / write to another one of my ADLS Gen2 storage accounts. Until now, everything worked perfectly with my old one.

I updated credentials.yml with the new account name and key, but my catalog seems to still point to my old storage account, as shown in this excerpt of the kedro run command's output:

2021-12-06 15:23:21,735 - azure.core.pipeline.policies.http_logging_policy - INFO - Request URL: 'https://<old_storage_account_name>.blob.core.windows.net....
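For reference, a minimal sketch of what the two config files typically look like in a Kedro project (the entry name, account name, key, dataset, and paths below are placeholders, not taken from the question):

```yaml
# conf/local/credentials.yml -- placeholder values, not real credentials
dev_abs:
  account_name: "newstorageaccount"
  account_key: "<new_account_key>"

# conf/base/catalog.yml -- the dataset must reference the updated entry
example_dataset:
  type: pandas.CSVDataSet
  filepath: "abfs://my-container/data/example.csv"
  credentials: dev_abs
```

If a catalog entry still references a credentials key tied to the old account, the request URL in the logs will keep showing the old storage account even after you edit one of the files.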

Any help on how to change the source/destination without such issues?

Thank you in advance.

Upvotes: 0

Views: 244

Answers (2)

Debanjan Banerjee

Reputation: 61

Yeah, I agree, it would be great if you could add some snippets of your execution. But with what we have, one more thing to note, in case you are using Kedro in Jupyter: it won't detect the changes unless you restart your notebook or have the following snippet in one of your cells:

%load_ext autoreload

%autoreload 2

Upvotes: 1

datajoely

Reputation: 1516

It's a little hard to support from what you've posted. Realistically, Kedro simply reads whatever is in the YAML files available at run time. So I have a couple of theories:

  1. Is it possible you are pointing to an older version of the codebase?
  2. Do you have multiple credentials.yml files in your codebase which may be taking precedence?
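To check the second theory, here is a small sketch that lists every credentials file Kedro could pick up. It assumes the standard conf/base and conf/local layout (conf/local overrides conf/base at load time); the project root path is a placeholder to adjust:

```python
from pathlib import Path


def find_credentials(project_root):
    """Return every credentials file under conf/, in merge order.

    Kedro loads conf/base first and lets conf/local override it, so a
    stale credentials.yml in either environment can shadow your update.
    """
    return sorted(Path(project_root).glob("conf/**/credentials*.yml"))


# Example: list the candidates in the current Kedro project.
for path in find_credentials("."):
    print(path)
```

If this prints more than one file per environment, check which one actually holds the new account name and key.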

Upvotes: 2
