Reputation: 173
I'm trying to merge multiple images (500 pics) together in a for
loop. The image size is constant and not too big (225*410 px). What I need is a single image composed of the 500 initial pics stuck together side by side.
I've tried a for
loop using some functions from the EBImage package. abind()
works like the traditional rbind()
. The code I've used is the following:
library(EBImage) # for readImage()
library(abind)
# path = a character vector containing the paths of the source images
final_image <- readImage(path[1]) # initialize the final image
for (i in 2:500){
  im <- readImage(path[i]) # read the i-th image
  final_image <- abind(final_image, im, along=1) # append it to the previous ones
}
The code works but, obviously, it's really slow, because the size of final_image
grows at each iteration, so each abind() call copies everything again.
Does anyone know a faster workaround? Thanks!
Upvotes: 3
Views: 1725
Reputation: 160447
In general, iteratively rbind
ing (the same goes for the other *bind
functions) is a really bad idea, as it makes a complete copy of the accumulated result on each iteration of the loop (as you noticed). Notice that in ?abind
, it takes ...
:
... Any number of vectors, matrices, arrays, or data frames. The
dimensions of all the arrays must match, except on one dimension
(specified by along=). If these arguments are named, the name will be
used for the name of the dimension along which the arrays are joined.
Vectors are treated as having a dim attribute of length one.
which allows us to use do.call
to do the binding all at once on a single list of all images. Try this (untested):
list_of_images <- lapply(path, readImage)
combined <- do.call(abind, c(list_of_images, list(along = 1)))
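As a minimal sketch of why this works, here is the same pattern on small dummy arrays (no image files needed; array() stands in for the decoded images, and the dimensions are made up for illustration):

```r
library(abind)

# Three fake "images", each 2x4 with 3 channels
list_of_images <- lapply(1:3, function(i) array(i, dim = c(2, 4, 3)))

# One combined copy instead of n-1 incremental copies
combined <- do.call(abind, c(list_of_images, list(along = 1)))
dim(combined) # 6 4 3
```

Note that, per ?abind, arguments that are lists are treated as if their elements had been passed individually, so abind(list_of_images, along = 1) should be equivalent here.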
Upvotes: 3