Reputation: 31
I'm setting up Databricks Connect on my local machine. After completing all the configuration steps, I receive the following error from
databricks-connect test
Traceback (most recent call last):
  File "c:\programdata\anaconda3\lib\runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "c:\programdata\anaconda3\lib\runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "C:\ProgramData\Anaconda3\Scripts\databricks-connect.exe\__main__.py", line 9, in <module>
  File "c:\programdata\anaconda3\lib\site-packages\pyspark\databricks_connect.py", line 262, in main
    test()
  File "c:\programdata\anaconda3\lib\site-packages\pyspark\databricks_connect.py", line 186, in test
    spark_home = os.path.dirname(pyspark.__file__)
TypeError: expected str, bytes or os.PathLike object, not NoneType
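For context on the last line of the traceback: `os.path.dirname` raises exactly this `TypeError` when it is handed `None`, which is what `pyspark.__file__` evaluates to when Python cannot resolve the installed package to a real file. A minimal reproduction of just that failure:

```python
import os

# os.path.dirname(pyspark.__file__) fails the same way when
# pyspark.__file__ is None; reproduce with None directly.
try:
    os.path.dirname(None)
except TypeError as exc:
    print(exc)  # expected str, bytes or os.PathLike object, not NoneType
```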
Upvotes: 3
Views: 1083
Reputation: 1
Check your SPARK_HOME. If SPARK_HOME is set to a version of Spark other than the one bundled with the client, you should unset the SPARK_HOME variable and try again.
Check your IDE's environment variable settings, your .bashrc, .zshrc, or .bash_profile file, and anywhere else environment variables might be set. You will most likely have to quit and restart your IDE to purge the old state, and you may even need to create a new project if the problem persists.
You should not need to set SPARK_HOME to a new value; unsetting it should be sufficient.
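A quick sketch of the unset step, assuming a bash/zsh shell (on Windows cmd the equivalent would be `set SPARK_HOME=` in the same session):

```shell
# Clear any stale SPARK_HOME pointing at a different Spark install,
# then confirm it is empty before re-running databricks-connect test.
unset SPARK_HOME
echo "SPARK_HOME='${SPARK_HOME}'"
```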
Upvotes: 0
Reputation: 54
It looks like you are missing the configuration, which can be easily done via
databricks-connect configure
You will need the following parameters:
URL: your workspace URL, of the form https://<your-workspace>.cloud.databricks.com.
User token: A user token.
Cluster ID: The ID of the cluster you created. You can obtain the cluster ID from the cluster's URL; for example, a URL containing /clusters/0304-201045-xxxxxxxx has cluster ID 0304-201045-xxxxxxxx.
For more info, please see https://docs.databricks.com/dev-tools/databricks-connect.html
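As a sketch of what the configure step produces: `databricks-connect configure` writes these values as JSON to a `.databricks-connect` file in your home directory (the host, token, and cluster ID below are placeholders, not real credentials):

```python
import json

# Placeholder values illustrating the shape of ~/.databricks-connect
# after running `databricks-connect configure`.
config = {
    "host": "https://dbc-example.cloud.databricks.com",  # hypothetical workspace URL
    "token": "dapiXXXXXXXXXXXXXXXX",                     # placeholder user token
    "cluster_id": "0304-201045-xxxxxxxx",
    "port": 15001,
}
print(json.dumps(config, indent=2))
```

If `databricks-connect test` still fails after configuring, checking that this file exists and contains the expected host and cluster ID is a reasonable first debugging step.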
Upvotes: 0