justinnewton987

Reputation: 1

Can run pyspark.cmd but not pyspark from command prompt

I am trying to get PySpark set up on Windows. I have Java, Python, Hadoop, and Spark installed, and I believe the environment variables are set up as I've been instructed elsewhere. In fact, I am able to run this from the command prompt:

pyspark.cmd

And it will load up the pyspark interpreter. However, I should be able to run pyspark unqualified (without the .cmd extension), and importing it from Python won't work otherwise. It does not matter whether I navigate directly into spark\bin or not, because spark\bin is already added to my PATH.
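For reference, a quick way to confirm the shell can actually see it (where is a standard Windows command that searches the PATH):

rem Lists every file on the PATH matching "pyspark" (PATHEXT extensions are tried too)
where pyspark
rem Expected output should include something like ...\spark\bin\pyspark.cmd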

.cmd is listed in my PATHEXT variable, so I don't get why the pyspark command by itself doesn't work.
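And checking the extension list itself:

rem Shows the extensions cmd.exe appends automatically when resolving a bare command name
echo %PATHEXT%
rem .CMD should appear in the list, e.g. .COM;.EXE;.BAT;.CMD;...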

Thanks for any help.

Upvotes: 0

Views: 369

Answers (1)

justinnewton987

Reputation: 1

While I still don't know exactly why, I think the issue somehow stemmed from how I unzipped the Spark tar file. Within the spark\bin folder, I was unable to run any .cmd programs without including the .cmd extension, yet I could do that in basically any other folder. I re-extracted the archive and the problem went away.
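In case it helps anyone, this is roughly what I mean by redoing the extraction, using the tar that ships with recent Windows 10/11 builds (the archive name below is just a placeholder for whichever Spark release you downloaded):

rem Re-extract the Spark archive; the .tgz name is a placeholder for your download
tar -xzf spark-3.x.x-bin-hadoop3.tgz
rem This creates a spark-3.x.x-bin-hadoop3 folder in the current directory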

Upvotes: 0
