
[Solved] No module named pyspark

Hello guys, how are you all? Hope you are all fine. Today I got the following error in Python: No module named 'pyspark'. Here I explain all the possible solutions.

Without wasting your time, let's start solving this error.

How Does the No module named pyspark Error Occur?

This error occurs when you try to import pyspark but it is not installed in, or visible to, the Python environment you are running.

How To Solve the No module named pyspark Error?

  1. Install pyspark into the Python environment you are using (Solution 1).

  2. Use findspark to point Python at an existing Spark installation (Solution 2).

Solution 1

You don't have pyspark installed in a place visible to the Python installation you're using. To confirm this, open a terminal with your virtualenv activated, start the REPL (python), and type import pyspark:

$ python
Python 3.5.0 (default, Dec  3 2015, 09:58:14) 
[GCC 4.2.1 Compatible Apple LLVM 7.0.0 (clang-700.1.76)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import pyspark
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: No module named 'pyspark'

If you see the ImportError: No module named 'pyspark', you need to install the library. Quit the REPL and run:

pip install pyspark

Then re-enter the REPL to confirm the import works:

$ python
Python 3.5.0 (default, Dec  3 2015, 09:58:14) 
[GCC 4.2.1 Compatible Apple LLVM 7.0.0 (clang-700.1.76)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import pyspark
>>>
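If pip install succeeds but the import still fails, pip may belong to a different interpreter than the python you are running. A quick way to check, sketched below (commands are illustrative; using python3 -m pip guarantees that pip and the interpreter match):

```shell
# Show which interpreter is active, then ask that same interpreter
# whether it can see pyspark.
which python3
python3 -m pip show pyspark || echo "pyspark is not installed for this interpreter"
python3 -c "import pyspark" 2>/dev/null && echo "pyspark OK" || echo "pyspark missing"
```

If the last line prints "pyspark missing", install it with python3 -m pip install pyspark so it lands in the interpreter you are actually using.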

Note that it is critical that your virtual environment is activated. From the directory of your virtual environment, run:

$ source bin/activate

These instructions are for a unix-based machine, and will vary for Windows.
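As a concrete sketch of the whole activation step on a unix-like machine (the directory name venv is an assumption; yours may differ):

```shell
# Create a virtual environment and activate it; once activated,
# "python" and "pip" resolve inside the environment.
python3 -m venv venv
source venv/bin/activate
which python    # should now print a path inside ./venv
# pip install pyspark   # would install into this environment only
```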

Solution 2

You can use findspark to make Spark accessible at run time. Install it with pip install findspark. Typically findspark will find the directory where you installed Spark, but if Spark lives in a non-standard location you can point findspark at the correct directory. If Spark is installed at /path/to/spark_home, just put

import findspark
findspark.init('/path/to/spark_home')

at the very top of your script/notebook and you should now be able to access the pyspark module.
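Under the hood, findspark.init() mostly just puts Spark's bundled Python libraries on sys.path. A rough, simplified sketch of that mechanism (this is illustrative, not findspark's actual source; init_spark is a hypothetical name):

```python
import os
import sys

def init_spark(spark_home):
    """Roughly what findspark.init() does: make Spark's bundled Python
    libraries importable by prepending them to sys.path."""
    os.environ["SPARK_HOME"] = spark_home
    python_dir = os.path.join(spark_home, "python")
    lib_dir = os.path.join(python_dir, "lib")
    paths = [python_dir]
    # Spark ships py4j as a zip archive under python/lib; Python can
    # import directly from a zip file placed on sys.path.
    if os.path.isdir(lib_dir):
        paths += [os.path.join(lib_dir, f)
                  for f in os.listdir(lib_dir) if f.startswith("py4j")]
    for p in reversed(paths):
        if p not in sys.path:
            sys.path.insert(0, p)
    return paths
```

This also explains why findspark must run before import pyspark: the import only succeeds once Spark's python directory is already on sys.path.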

Summary

That's all about this issue. I hope one of the solutions helped you. Comment below with your thoughts and queries, and let us know which solution worked for you. Thank you.
