Reputation: 111
I am trying to load a pre_trained model named "tr_model.h5" for my assignment but I get the following error:
Traceback (most recent call last):
File "Trigger_Project.py", line 84, in <module>
model = load_model(filename)
File "/home/neeraj/anaconda3/lib/python3.6/site-packages/h5py/_hl/files.py", line 99, in make_fid
fid = h5f.open(name, flags, fapl=fapl)
File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
File "h5py/h5f.pyx", line 78, in h5py.h5f.open
OSError: Unable to open file (unable to open file: name = 'tr_model.h5', errno = 2, error message = 'No such file or directory', flags = 0, o_flags = 0)
I have made sure that the file is present. I don't know why it is showing an OSError. I am using Ubuntu 18.04 and all the required libraries are up to date. Any help is much appreciated.
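One way to diagnose this kind of error (a hypothetical sketch, not code from the question) is to print the working directory and check whether the relative path actually resolves before calling load_model, since relative paths are resolved against the current working directory, not the script's location:

```python
import os

filename = 'tr_model.h5'  # the path used in the question

# load_model resolves relative paths against the *current working
# directory*, which may differ from where the script lives.
print('cwd:', os.getcwd())
print('absolute path tried:', os.path.abspath(filename))
print('exists:', os.path.exists(filename))
```

If `exists` prints False, the file is not where the process is looking, regardless of where it sits on disk.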
Upvotes: 11
Views: 93391
Reputation: 53
I came across this issue and get it solved by removing the project in pycharm and recreate a project in the same folder.
Upvotes: 0
Reputation: 2112
I had loss always equal to inf, therefore the model file had never been written to disk by the callback, as it was only written when loss improved.
When I eliminated all NaN values from the dataset, everything went smoothly and the model appeared on the file system.
Enabling verbosity in the callback helped to diagnose this.
For details, see the official Keras docs on callbacks.
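As a rough illustration of the fix (assuming the dataset is a NumPy array; the helper name is mine, not from the answer), NaN rows can be dropped before training so the loss stays finite and the checkpoint callback actually saves a file:

```python
import numpy as np

def drop_nan_rows(data):
    """Return only the rows of `data` that contain no NaN values."""
    return data[~np.isnan(data).any(axis=1)]

# Example: two clean rows survive, the NaN row is removed.
raw = np.array([[1.0, 2.0], [np.nan, 3.0], [4.0, 5.0]])
clean = drop_nan_rows(raw)
print(clean.shape)  # (2, 2)
```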
Upvotes: 0
Reputation: 81
I had the same issue when setting the path for training checkpoints using tensorflow.keras.callbacks.ModelCheckpoint.
I had set my path to:
path = os.path.join(subdir, filename)
Using an f-string solved the problem:
path = f'{subdir}/{filename}'
So I would also check how you are importing load_model: try tensorflow.keras.models instead of only keras.models.
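For comparison (a minimal sketch; the subdir and filename values here are made up), both forms build a checkpoint path, but os.path.join uses the platform's separator while the f-string always uses '/':

```python
import os

subdir = 'checkpoints'    # hypothetical directory
filename = 'weights.h5'   # hypothetical file name

joined = os.path.join(subdir, filename)  # '\\' on Windows, '/' elsewhere
fstring = f'{subdir}/{filename}'         # always '/'

print(joined)
print(fstring)
```

The difference in separators can matter when a library, or code running inside a container, expects forward slashes.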
Upvotes: 4
Reputation: 676
I encountered the same issue in Colab after mounting the drive with "%cd /gdrive" and writing "model.save('/gdrive/mnist.h5')". Then I realised files can be created and saved in '/gdrive/My Drive/', not directly in '/gdrive'. So it is important to check whether you can create files in the specified path.
So, model.save('/gdrive/My Drive/mnist.h5') worked for me.
Upvotes: 0
Reputation: 1
It worked for me.
Upvotes: 0
Reputation: 621
I solved this problem by specifying an absolute path instead: get the absolute path of the working folder and then append the path to the file from there. In my case, the file is in a directory named datasets, so I tried the following code:
file_name = os.path.dirname(__file__) +'\\datasets\\test_catvnoncat.h5'
test_dataset = h5py.File(file_name, "r")
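A more portable variant (my sketch, not the answer's code; the 'datasets' name and the fallback to the current directory are assumptions) avoids hard-coding the Windows '\\' separator:

```python
import os

def dataset_path(name, base=None):
    """Build an absolute path to a file in the 'datasets' subdirectory."""
    if base is None:
        try:
            # Directory containing this script, when run as a file.
            base = os.path.dirname(os.path.abspath(__file__))
        except NameError:
            # Fallback for REPLs/notebooks where __file__ is undefined.
            base = os.getcwd()
    # os.path.join picks the right separator on every OS.
    return os.path.join(base, 'datasets', name)

print(dataset_path('test_catvnoncat.h5', base='/data'))
```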
Upvotes: 1
Reputation: 1695
If you are working on Colab with Google Drive, mount the drive to Colab using:
# Load the Drive helper and mount
from google.colab import drive
# This will prompt for authorization.
drive.mount('/content/drive')
Upvotes: 1
Reputation: 3544
I encountered the same issue, as I posted in my question:
h5py.File(path) doesn't recognize folder path
My initial reasoning is that h5py.File(path) doesn't handle a relative subfolder path as its argument: e.g. load_model("neunet.h5") has no issue, but load_model("subfolder/neunet.h5") gives the same error.
In a nutshell, my solution is to simply put the .h5 files into the working home folder of my Jupyter notebook, which is the place where you created the .ipynb file. You can use print(os.getcwd()) in the notebook to see your current working directory.
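Equivalently (a small sketch of my own; the model name is taken from the answer's example), you can anchor the relative name to the notebook's working directory and verify the resulting absolute path before loading:

```python
import os

model_name = 'neunet.h5'  # example model file from the answer

print('working directory:', os.getcwd())
model_path = os.path.join(os.getcwd(), model_name)
print('looking for:', model_path, '->', os.path.exists(model_path))
```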
Upvotes: 6
Reputation: 749
I encountered this error when I was working with a Docker image. Since a customized Docker image has a different path than the root path, I got the same error:
OSError: Unable to open file (unable to open file: name = '', errno = 2, error message = 'No such file or directory', flags = 0, o_flags = 0)
Basically, a mismatch of paths raises this error.
Upvotes: 0