
[Solved] ValueError: Can not squeeze dim[1], expected a dimension of 1, got 3 for 'sparse_softmax_cross_entropy_loss'

Hello guys, how are you all? Hope you are all fine. Today I got the following error in Python: ValueError: Can not squeeze dim[1], expected a dimension of 1, got 3 for 'sparse_softmax_cross_entropy_loss'. Here I explain all the possible solutions.

Without wasting your time, let's start this article and solve this error.

How Does the ValueError: Can not squeeze dim[1], expected a dimension of 1, got 3 for 'sparse_softmax_cross_entropy_loss' Error Occur?

This error is raised by tf.losses.sparse_softmax_cross_entropy when the labels tensor has the wrong shape — typically because one-hot encoded labels were passed where class indices were expected.

How To Solve the ValueError: Can not squeeze dim[1], expected a dimension of 1, got 3 for 'sparse_softmax_cross_entropy_loss' Error?

Solution 1

The error here comes from tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits).

The TensorFlow documentation clearly states that the "labels vector must provide a single specific index for the true class for each row of logits". So your labels vector must contain only class indices like 0, 1, 2, and not their one-hot encodings like [1,0,0], [0,1,0], [0,0,1].

Reproducing the error to explain further:

import numpy as np
import tensorflow as tf

# Create a random array and use it as the logits tensor
np.random.seed(12345)
logits = tf.convert_to_tensor(np.random.sample((4, 4)))
print(logits.get_shape())  # (4, 4)

# Create random labels (assuming only 4 classes)
labels = tf.convert_to_tensor(np.array([2, 2, 0, 1]))

loss_1 = tf.losses.sparse_softmax_cross_entropy(labels, logits)

sess = tf.Session()
sess.run(tf.global_variables_initializer())

print('Loss: {}'.format(sess.run(loss_1)))  # 1.44836854

# Now passing one-hot encodings instead of class indices as labels
wrong_labels = tf.convert_to_tensor(np.array([[0, 0, 1, 0], [0, 0, 1, 0],
                                              [1, 0, 0, 0], [0, 1, 0, 0]]))
loss_2 = tf.losses.sparse_softmax_cross_entropy(wrong_labels, logits)

# This raises a similar ValueError as soon as loss_2 is defined

So try passing class indices instead of one-hot encodings in your labels vector. Hope this clears your doubt.
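If your labels are already one-hot encoded, a quick way to recover class indices is np.argmax along the class axis. A minimal NumPy sketch (the variable names here are illustrative):

```python
import numpy as np

# One-hot encoded labels: 4 samples, 4 classes
one_hot_labels = np.array([[0, 0, 1, 0],
                           [0, 0, 1, 0],
                           [1, 0, 0, 0],
                           [0, 1, 0, 0]])

# argmax along axis=1 recovers the class index of each row
class_indices = np.argmax(one_hot_labels, axis=1)
print(class_indices)  # [2 2 0 1]
```

The resulting vector has shape (4,) and can be fed directly to tf.losses.sparse_softmax_cross_entropy as the labels argument.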

Solution 2

If you used Keras' ImageDataGenerator, you can add class_mode="sparse" to obtain the correct labels:

train_datagen = keras.preprocessing.image.ImageDataGenerator(
        rescale=1./255,
        shear_range=0.2,
        zoom_range=0.2,
        horizontal_flip=True)
train_generator = train_datagen.flow_from_directory(
        'data/train',
        target_size=(150, 150),
        batch_size=32, 
        class_mode="sparse")

Alternatively, you might be able to use softmax_cross_entropy instead, which expects one-hot encoded labels.
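To see why the two loss functions take different label formats, here is a small NumPy-only sketch (no TensorFlow needed) showing that cross-entropy with one-hot labels and "sparse" cross-entropy with class indices compute the same per-row value:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the class axis
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

logits = np.array([[2.0, 0.5, 0.1],
                   [0.2, 1.5, 0.3]])
class_indices = np.array([0, 1])            # "sparse" label format
one_hot = np.eye(3)[class_indices]          # equivalent one-hot format

probs = softmax(logits)

# Sparse form: -log(probability of the true class) per row
sparse_loss = -np.log(probs[np.arange(len(logits)), class_indices])

# Dense form: cross-entropy against the one-hot vector per row
dense_loss = -(one_hot * np.log(probs)).sum(axis=1)

print(np.allclose(sparse_loss, dense_loss))  # True
```

Both forms agree; the only difference is the label format each loss function expects, which is exactly what the ValueError is complaining about.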

Summary

That's all about this issue. I hope one of the solutions helped you. Comment below with your thoughts and queries, and let us know which solution worked for you. Thank you.
