
How to build a sparkSession in Spark 2.0 using pyspark?

Hello guys, how are you all? Hope you are all fine. Today we are going to learn how to build a SparkSession in Spark 2.0 using PySpark in Python. Here I explain all the possible methods.

Without wasting your time, let's start this article.

Table of Contents

  1. Method 1
  2. Method 2

Method 1

# Create (or reuse) a SparkSession with the application name 'abc'
from pyspark.sql import SparkSession
spark = SparkSession.builder.appName('abc').getOrCreate()

Now, to read a .csv file into a DataFrame you can use:

# Read a CSV file, treating the first row as column headers
df = spark.read.csv('filename.csv', header=True)
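
After loading, you can check what came back. A minimal sketch, assuming the filename.csv placeholder above points to a real file with a header row:

# Inspect the loaded DataFrame
df.printSchema()   # column names and types (all strings unless inferSchema=True is passed)
df.show(5)         # preview the first five rows
print(df.count())  # total number of rows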

Method 2

As you can see in the Scala example, SparkSession is part of the sql module. The same is true in Python.

class pyspark.sql.SparkSession(sparkContext, jsparkSession=None) is the entry point to programming Spark with the Dataset and DataFrame API. A SparkSession can be used to create DataFrames, register DataFrames as tables, execute SQL over tables, cache tables, and read parquet files. To create a SparkSession, use the following builder pattern:

>>> spark = SparkSession.builder \
...     .master("local") \
...     .appName("Word Count") \
...     .config("spark.some.config.option", "some-value") \
...     .getOrCreate()
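
To see the "register DataFrames as tables, execute SQL over tables" part in action, here is a minimal sketch using the spark session built above; the view name "people" and the sample rows are made up for illustration:

# Build a small DataFrame, register it as a temporary view, and query it with SQL
data = [("alice", 34), ("bob", 45)]
df = spark.createDataFrame(data, ["name", "age"])
df.createOrReplaceTempView("people")
spark.sql("SELECT name FROM people WHERE age > 40").show()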

Summary

That's all for this issue. I hope these methods helped you. Comment below with your thoughts and queries, and let me know which method worked for you. Thank you.
