
how to check if a string column in pyspark dataframe is all numeric

Hello guys, how are you all? Hope you are all fine. Today we are going to learn how to check if a string column in a PySpark DataFrame is all numeric. Here I explain all the possible methods.

Without wasting your time, let's start this article.

Table of Contents

how to check if a string column in pyspark dataframe is all numeric?

  1. Method 1
  2. Method 2

Method 1

A simple cast would do the job:

from pyspark.sql import functions as F

my_df.select(
  "ID",
  F.col("ID").cast("int").isNotNull().alias("Value")
).show()

+-----+-----+
|   ID|Value|
+-----+-----+
|25q36|false|
|75647| true|
|13864| true|
|8758K|false|
|07645| true|
+-----+-----+
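The cast returns null for any value that cannot be parsed as an integer, which is why isNotNull() works as the numeric test. As a rough illustration of that behavior, here is a plain-Python stand-in (illustrative only, not Spark's exact parser, which also trims whitespace and range-checks against 32-bit integers):

```python
def is_int_castable(s):
    # Plain-Python approximation of Spark's cast("int").isNotNull() check.
    # int() raises ValueError for strings like "25q36", mirroring the
    # null that Spark's cast produces for unparseable values.
    try:
        int(s)
        return True
    except ValueError:
        return False

# The sample IDs from the table above.
ids = ["25q36", "75647", "13864", "8758K", "07645"]
for s in ids:
    print(s, is_int_castable(s))
```

Note that "07645" passes: a leading zero is still a valid integer, matching the true in the table above.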

Method 2

If instead you want the non-numeric rows filtered out of the table entirely, apply the same cast inside filter:

df2.filter(F.col("id").cast("int").isNotNull()).show()

Also, with this approach there is no need to create a new column such as Value.


An alternative, similar solution uses a SQL expression string:

df2.filter("CAST(id AS INT) IS NOT NULL").show()
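If you want a stricter check that accepts only plain digit strings (rejecting signs and surrounding whitespace, which a cast to int would tolerate), a regular-expression test works; in PySpark that would be F.col("id").rlike(r"^[0-9]+$"). A plain-Python sketch of that stricter predicate on the sample values (the "+75647" entry is an added hypothetical example showing the difference from the cast):

```python
import re

DIGITS_ONLY = re.compile(r"^[0-9]+$")

def is_all_digits(s):
    # Stricter than CAST(... AS INT): values like "+5" or " 5" fail
    # here even though the cast would parse them successfully.
    return bool(DIGITS_ONLY.match(s))

ids = ["25q36", "75647", "13864", "8758K", "07645", "+75647"]
print({s: is_all_digits(s) for s in ids})
```

Which variant to pick depends on whether signed or whitespace-padded numbers should count as "numeric" for your data.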

Summary

That is all about this issue. Hope these methods helped you. Comment below with your thoughts and queries, and let us know which method worked for you. Thank you.
