user11955299

Reputation:

FileNotFoundError: [Errno 2] No such file or directory -- even if I'm using the full path

I'm using Python 3 in Google Colab. I keep getting the error "FileNotFoundError: [Errno 2] No such file or directory" even though I'm pretty sure the path I entered is correct, and I've verified that the folders are there.

I've used "pip install keras" and "pip install tensorflow" initially. I've also tried using \\ instead of \ in the path.
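(Editorial note, not part of the original question: in a plain Python string literal a backslash can start an escape sequence, so a Windows path written with single backslashes may not be the string you think it is; "\t" silently becomes a tab character. Raw strings or forward slashes sidestep this:)

```python
# In a plain string literal, \t is interpreted as a tab character,
# so this path is silently corrupted:
plain = 'E:\Project\dataset\training_set'
print('\t' in plain)   # True -- the \t became a real tab

# A raw string keeps every backslash literal:
raw = r'E:\Project\dataset\training_set'
print('\t' in raw)     # False

# Forward slashes also work on Windows and avoid escapes entirely:
fwd = 'E:/Project/dataset/training_set'
```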

# Importing the Keras libraries and packages
from keras.models import Sequential
from keras.layers import Conv2D
from keras.layers import MaxPooling2D
from keras.layers import Flatten
from keras.layers import Dense
# Initialising the CNN
classifier = Sequential()
# Step 1 - Convolution
classifier.add(Conv2D(32, (3, 3), input_shape = (64, 64, 3), activation = 'relu'))
# Step 2 - Pooling
classifier.add(MaxPooling2D(pool_size = (2, 2)))
# Adding a second convolutional layer
classifier.add(Conv2D(32, (3, 3), activation = 'relu'))
classifier.add(MaxPooling2D(pool_size = (2, 2)))
# Step 3 - Flattening
classifier.add(Flatten())
# Step 4 - Full connection
classifier.add(Dense(units = 128, activation = 'relu'))
classifier.add(Dense(units = 1, activation = 'sigmoid'))
# Compiling the CNN
classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])
# Part 2 - Fitting the CNN to the images
from keras.preprocessing.image import ImageDataGenerator
train_datagen = ImageDataGenerator(rescale = 1./255,
                                   shear_range = 0.2,
                                   zoom_range = 0.2,
                                   horizontal_flip = True)
test_datagen = ImageDataGenerator(rescale = 1./255)
training_set = train_datagen.flow_from_directory('E:\Project\dataset\training_set',
                                                 target_size = (64, 64),
                                                 batch_size = 32,
                                                 class_mode = 'binary')
test_set = test_datagen.flow_from_directory('E:\Project\dataset\test_set',
                                            target_size = (64, 64),
                                            batch_size = 32,
                                            class_mode = 'binary')
classifier.fit_generator(training_set,
                         steps_per_epoch = 8000,
                         epochs = 25,
                         validation_data = test_set,
                         validation_steps = 2000)
# Part 3 - Making new predictions
import numpy as np
from keras.preprocessing import image
test_image = image.load_img('E:\Project\dataset\single_prediction\cat_or_dog_1.jpg', target_size = (64, 64))
test_image = image.img_to_array(test_image)
test_image = np.expand_dims(test_image, axis = 0)
result = classifier.predict(test_image)
training_set.class_indices
if result[0][0] == 1:
    prediction = 'dog'
else:
    prediction = 'cat'

FileNotFoundError                         Traceback (most recent call last)
<ipython-input-7-757ca18e13d5> in <module>()
     30 target_size = (64, 64),
     31 batch_size = 32,
---> 32 class_mode = 'binary')
     33 test_set = test_datagen.flow_from_directory('E:\Project\dataset\test_set',
     34 target_size = (64, 64),

1 frames
/usr/local/lib/python3.6/dist-packages/keras_preprocessing/image/directory_iterator.py in __init__(self, directory, image_data_generator, target_size, color_mode, classes, class_mode, batch_size, shuffle, seed, data_format, save_to_dir, save_prefix, save_format, follow_links, subset, interpolation, dtype)
    104         if not classes:
    105             classes = []
--> 106             for subdir in sorted(os.listdir(directory)):
    107                 if os.path.isdir(os.path.join(directory, subdir)):
    108                     classes.append(subdir)

FileNotFoundError: [Errno 2] No such file or directory: 'E:\\Project\\dataset\training_set'
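(Editorial note: the traceback shows Keras calling os.listdir on the directory. A minimal pre-flight check — a hypothetical helper, not part of the original code — surfaces the same failure earlier and with a clearer message; it mirrors what flow_from_directory does internally when no classes are given:)

```python
import os

def check_dataset_dir(path):
    """Fail early if the dataset directory is missing; otherwise
    return the sorted class subdirectories, as Keras would infer them."""
    if not os.path.isdir(path):
        raise FileNotFoundError(f'No such directory: {path!r}')
    return sorted(
        d for d in os.listdir(path)
        if os.path.isdir(os.path.join(path, d))
    )
```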

Upvotes: 0

Views: 3532

Answers (1)

amrs-tech

Reputation: 483

I don't think you can directly access files on your local machine from a Google Colab notebook. You have to get your files into Colab first, either by mounting Google Drive or by uploading them directly from your computer.

You can refer to this link for more details.
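(Editorial note: the Drive route described above can be sketched as follows. The Drive path and helper name are hypothetical, and google.colab is only importable inside Colab, so this falls back to the original local path elsewhere:)

```python
import os

def resolve_dataset_dir(colab_path='/content/drive/MyDrive/Project/dataset/training_set',
                        local_path='E:/Project/dataset/training_set'):
    """Mount Google Drive when running in Colab and return whichever
    dataset directory applies, failing loudly if it does not exist."""
    try:
        from google.colab import drive  # only available inside Colab
        drive.mount('/content/drive')
        candidate = colab_path
    except ImportError:
        candidate = local_path
    if not os.path.isdir(candidate):
        raise FileNotFoundError(f'dataset directory not found: {candidate!r}')
    return candidate
```

The returned path can then be passed to flow_from_directory instead of the hard-coded Windows path.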

Upvotes: 1
