Reputation: 1
I've been using Google Colab for my Python scripts and it has been quite comfortable so far with "just code".
I would now like to start working with images. I am writing a script that goes through a hundred images or more, and I'm not sure what the accepted way of doing this in Colab is, since "there is no local drive".
Original code:
import cv2 as cv
import glob

def get_imgs(path_to_imgs):
    # recursive=True is needed for the "**" pattern to descend into subfolders
    all_files = glob.glob(path_to_imgs + "**/*.png", recursive=True)
    return (cv.imread(file, flags=cv.IMREAD_GRAYSCALE) for file in all_files)

def show_img(img):
    cv.imshow('window_name', img)
    cv.waitKey(0)
    cv.destroyAllWindows()

for counter, each_img in enumerate(get_imgs('./myImages/')):
    if counter == 10:
        break
    show_img(each_img)
So far I have looked into storing all the images on my own Drive (and connecting Colab to Drive with the access key, which is a little uncomfortable) or storing them on raw GitHub.
In both scenarios I am not able to get the image-loading part to work, and I think it is mainly because I cannot walk through the "folder" as I would do locally.
Any thoughts or recommendations?
Upvotes: 0
Views: 886
Reputation: 51857
In addition to @DraperDuck's helpful answer, I'd like to point you to another option: using Google Drive from the Colab notebook.
Personally I prefer mounting the Drive and accessing a previously set-up folder with the dataset:
from google.colab import drive
drive.mount('/content/drive')
(Side note: I can't remember whether cv2.imshow is handled by the Colab Jupyter notebook environment, as cv2.imshow usually requires an OS windowing system, but if that fails you can display the images in the notebook using matplotlib's plt.imshow.)
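To connect this back to your original script: after mounting, the Drive contents show up under /content/drive/MyDrive, so the loop only needs to be pointed at that path and the display swapped to matplotlib. A minimal sketch, assuming the images were uploaded to a Drive folder named myImages (that folder name is just an illustration):
from google.colab import drive
import cv2 as cv
import glob
import matplotlib.pyplot as plt

drive.mount('/content/drive')

# Drive appears under /content/drive/MyDrive once mounted
path_to_imgs = '/content/drive/MyDrive/myImages/'

for counter, file in enumerate(glob.glob(path_to_imgs + '**/*.png', recursive=True)):
    if counter == 10:
        break
    img = cv.imread(file, flags=cv.IMREAD_GRAYSCALE)
    plt.imshow(img, cmap='gray')  # plt.imshow instead of cv2.imshow inside the notebook
    plt.axis('off')
    plt.show()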
Upvotes: 0
Reputation: 2859
In Google Colab, there is a small folder icon on the left sidebar. You might then have to click the folder icon at the top of the new file-system sidebar; this gives you access to all the folders of the virtual machine. To upload large image datasets, you can click the file upload button, or use wget to download them from raw GitHub URLs.
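For the wget route, a hypothetical example as it would be typed into a Colab cell (the repository URL below is made up for illustration):
# download a single image from a raw GitHub URL into the Colab VM's file system
!wget -q https://raw.githubusercontent.com/some-user/some-repo/main/images/sample.png -O sample.png

import cv2 as cv
img = cv.imread('sample.png', flags=cv.IMREAD_GRAYSCALE)
print(img.shape)  # sanity check that the file was downloaded and decoded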
Upvotes: 1