Reputation: 31
I installed Spark and set all the environment variables for Spark and Python properly, as also described on Stack Overflow, but I am still getting this warning when starting Spark:
20/09/06 13:33:52 WARN ProcfsMetricsGetter: Exception when trying to compute pagesize, as a result reporting of ProcessTree metrics is stopped
Upvotes: 3
Views: 3813
Reputation: 145
I was facing the same issue. The reason is that the environment variables are not set up properly. Earlier, I would just press Enter when the warning appeared, and the process continued from the next line without any issues. That is only a temporary workaround, but it worked.
The solution that worked for me permanently is to configure the environment/PATH variables properly.
Steps:
1. Make sure the py4j zip file is available. My py4j was in the directory
"C:\Spark\spark-3.0.1-bin-hadoop2.7\python\lib".
2. Open Environment Variables.
3. Select PATH under the user variables.
4. Click Edit, then add each of the following paths as a new entry:
5. %SPARK_HOME%\bin
6. %SPARK_HOME%\python
7. %SPARK_HOME%\python\lib\py4j-(version number)-src.zip; mine was 0.10.9,
so I typed %SPARK_HOME%\python\lib\py4j-0.10.9-src.zip
8. Finally, add %PYTHONPATH%.
Click OK and save all changes. Rerun the program; it should now work properly.
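As a way to sanity-check the paths from the steps above, here is a minimal sketch that builds the same PYTHONPATH entries from a given SPARK_HOME. The function name `pyspark_paths` is my own; it simply globs for the py4j zip so you don't have to hard-code the version number:

```python
import glob
import os


def pyspark_paths(spark_home):
    """Return the directories from the steps above that belong on PYTHONPATH.

    Covers %SPARK_HOME%\\python and the bundled py4j source zip; the py4j
    version number varies by Spark release, so we glob for it instead of
    hard-coding 0.10.9.
    """
    lib_dir = os.path.join(spark_home, "python", "lib")
    py4j_zips = glob.glob(os.path.join(lib_dir, "py4j-*-src.zip"))
    return [os.path.join(spark_home, "python")] + py4j_zips


if __name__ == "__main__":
    # Uses the SPARK_HOME environment variable if set; otherwise falls back
    # to the installation directory mentioned in the answer above.
    home = os.environ.get("SPARK_HOME", r"C:\Spark\spark-3.0.1-bin-hadoop2.7")
    for p in pyspark_paths(home):
        print(p)
```

If the py4j zip is missing from the printed list, step 1 above has not been satisfied and the warning is likely to persist.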
Upvotes: 3