
[Solved] Encountering "WARN ProcfsMetricsGetter: Exception when trying to compute pagesize" error when running Spark

Hello guys, how are you all? Hope you all are fine. Today I got the following warning when running Spark in Python: "WARN ProcfsMetricsGetter: Exception when trying to compute pagesize". Here I explain all the possible solutions.

Without wasting your time, let's start this article and solve this error.

How Does the "WARN ProcfsMetricsGetter: Exception when trying to compute pagesize" Error Occur When Running Spark?

Today I got the following warning when running Spark in Python: "WARN ProcfsMetricsGetter: Exception when trying to compute pagesize".

How To Solve the "WARN ProcfsMetricsGetter: Exception when trying to compute pagesize" Error When Running Spark?

Solution 1

The same problem occurred for me because the Python path was not added to the system environment variables. After adding it, everything works perfectly.

Adding a PYTHONPATH environment variable with the value:

%SPARK_HOME%\python;%SPARK_HOME%\python\lib\py4j-<version>-src.zip;%PYTHONPATH%

helped resolve this issue. Just check which py4j version you have in your spark/python/lib folder and substitute it for <version>.
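For reference, the same value can be assembled programmatically. This is a minimal sketch (build_pythonpath is a hypothetical helper name, and the C:\spark fallback is only an illustration), assuming SPARK_HOME points at your Spark installation:

```python
import glob
import os

def build_pythonpath(spark_home, existing=""):
    # Spark's bundled Python sources live under <SPARK_HOME>/python,
    # and py4j ships as a zip under <SPARK_HOME>/python/lib.
    py_dir = os.path.join(spark_home, "python")
    py4j_zips = glob.glob(os.path.join(py_dir, "lib", "py4j-*-src.zip"))
    parts = [py_dir] + py4j_zips
    if existing:
        parts.append(existing)  # preserve any pre-existing PYTHONPATH
    return os.pathsep.join(parts)

# Example: print the value to paste into your environment settings
print(build_pythonpath(os.environ.get("SPARK_HOME", r"C:\spark")))
```

On Windows the entries are joined with ; (os.pathsep), which matches the %SPARK_HOME%\python;… value shown above.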

Solution 2

I received this same message running Spark 3.0.1 on Windows 10 with Scala 2.12.10. It's not actually an error in the sense that it ends your program's execution; it's a warning related to the /proc file system on Linux machines.

If you are also on a Windows machine, the answer may be, to quote Wing Yew Poon at Apache: "The warning happened because the command 'getconf PAGESIZE' was run and it is not a valid command on Windows so an exception was caught." (From the Spark JIRA issue.)

If your program failed right after throwing this exception message, it failed for some other reason. In my case, Spark was crashing right after this warning with:

20/11/13 12:41:51 ERROR MicroBatchExecution: Query [id = 32320bc7-d7ba-49b4-8a56-1166a4f2d6db, runId = d7cc93c2-41ef-4765-aecd-9cd453c25905] terminated with error
org.apache.spark.SparkException: Job 1 cancelled because SparkContext was shut down

This warning can be hidden by setting spark.executor.processTreeMetrics.enabled to false. To quote Mr. Poon again, “it is a minor bug that you see this warning. But it can be safely ignored.”
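If you would rather suppress the warning than ignore it, the setting can be passed at launch time or made permanent in spark-defaults.conf. This is a config sketch, where your_app.py stands in for your own script:

```
# One-off, when submitting a job:
spark-submit --conf spark.executor.processTreeMetrics.enabled=false your_app.py

# Or permanently, in $SPARK_HOME/conf/spark-defaults.conf:
spark.executor.processTreeMetrics.enabled  false
```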

Summary

It’s all About this issue. Hope all solution helped you a lot. Comment below Your thoughts and your queries. Also, Comment below which solution worked for you? Thank You.
