Firuze Soltani

Tensorflow: Too many values to unpack error when loading data

I am learning machine learning and trying to do image classification. I am at the point of defining my train and test data from a directory, but I am getting an error:

ValueError: too many values to unpack (expected 2).

Can anyone help me fix this issue?

import tensorflow as tf
from tensorflow.keras import datasets, layers, models
import matplotlib.pyplot as plt
import pathlib

data_dir = "C:/Users/efsol/OneDrive/Classification"
data_dir = pathlib.Path(data_dir)
image_count = len(list(data_dir.glob('*/*.jpg')))
print(image_count)
data_dir.is_dir()

# The error occurs on this line
(train_images, train_labels), (test_images, test_labels) = tf.keras.preprocessing.image_dataset_from_directory(data_dir)

# Normalize pixel values to be between 0 and 1
train_images, test_images = train_images / 255.0, test_images / 255.0

Answers (1)

darth baba

tf.keras.preprocessing.image_dataset_from_directory returns a tf.data.Dataset, not a tuple of the form (train_images, train_labels), which is why unpacking its result raises ValueError: too many values to unpack (expected 2).
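
You can verify this by inspecting the return value (a quick illustrative check, not part of the original answer; data_dir is the path from the question):

ds = tf.keras.utils.image_dataset_from_directory(data_dir)
print(type(ds))         # a tf.data.Dataset, not an (images, labels) tuple
print(ds.element_spec)  # a pair of TensorSpecs: image batches and integer labels

To fix this, keep the data as datasets and create the training and validation splits explicitly: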

# Train split; use the same seed in both calls so the training and
# validation subsets are split consistently and do not overlap
train_ds = tf.keras.utils.image_dataset_from_directory(
    data_dir,
    validation_split=0.2,
    subset="training",
    seed=123,
)

# Validation/test split
val_ds = tf.keras.utils.image_dataset_from_directory(
    data_dir,
    validation_split=0.2,
    subset="validation",
    seed=123,
)

# This will print all the class names (inferred from the subdirectory names)
class_names = train_ds.class_names
print(class_names)

# You can use tf.keras.layers.Rescaling to normalize pixel values to be between 0 and 1
import numpy as np  # needed for np.min/np.max below

normalization_layer = layers.Rescaling(1./255)
normalized_ds = train_ds.map(lambda x, y: (normalization_layer(x), y))
image_batch, labels_batch = next(iter(normalized_ds))
first_image = image_batch[0]
print(np.min(first_image), np.max(first_image))  # pixel values are now in [0, 1]
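
From here the two datasets can be passed directly to Keras for training. As a minimal sketch, assuming a small illustrative architecture (not from the original answer) and the default image_size of (256, 256):

num_classes = len(class_names)

model = models.Sequential([
    layers.Rescaling(1./255, input_shape=(256, 256, 3)),  # normalize inside the model
    layers.Conv2D(16, 3, activation='relu'),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation='relu'),
    layers.Dense(num_classes),  # one logit per class
])

model.compile(
    optimizer='adam',
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),  # integer labels, logits
    metrics=['accuracy'],
)

model.fit(train_ds, validation_data=val_ds, epochs=10)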

Please refer to the TensorFlow image classification tutorial for the complete code.
