Reputation: 1
Hi, I'm trying to multiply a tensor with a matrix in the following fashion:
dimensions
W: a x b x c
V: a x c
I want Z such that
Z[i]=dot(W[i],V[i])
Z is then of dimension a x ((b x c) . (c x 1)), so (a x b).
I've tried numpy.tensordot
to do this but haven't been able to. Can it do what I want? If not, how can I do this WITHOUT loops?
Basically the equivalent of

def f(W, V):
    Z = []
    for i in range(len(W)):
        Z.append(np.dot(W[i], V[i]))
    return Z
Thanks
edit: Specifically, is this achievable with tensordot?
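For reference, a minimal sketch (shapes chosen arbitrarily) of why a single tensordot call does not produce this: tensordot sums over every paired axis, so pairing the shared first axis collapses the batch dimension rather than matching W[i] with V[i]:

```python
import numpy as np

a, b, c = 3, 4, 5
rng = np.random.default_rng(0)
W = rng.random((a, b, c))
V = rng.random((a, c))

# tensordot sums over the paired axes completely, so pairing the shared
# first axis contracts the batch dimension instead of iterating over it:
t = np.tensordot(W, V, axes=([0, 2], [0, 1]))
print(t.shape)  # (4,) -- one value per b, summed over all i

# the desired per-slice result has shape (a, b) instead
Z = np.array([np.dot(W[i], V[i]) for i in range(a)])
print(Z.shape)  # (3, 4)
```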
Upvotes: 0
Views: 2296
Reputation: 1516
What about

import numpy as np

a, b, c = 3, 5, 6
r = np.random.random
W = r((a, b, c))
V = r((a, c))
Z = np.sum(W * V[:, np.newaxis, :], axis=2)

Doesn't use loops or newer features and should be reasonably fast. Comparing with z_loop from J.F. Sebastian's post:

print(np.sum(np.abs(Z - z_loop(W, V))))

gives 4.99600361081e-16.
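As a sanity sketch of the broadcasting step (the explicit-loop comparison here is my own addition, standing in for z_loop from the other answer):

```python
import numpy as np

a, b, c = 3, 5, 6
W = np.random.random((a, b, c))
V = np.random.random((a, c))

# V[:, np.newaxis, :] has shape (a, 1, c); broadcasting it against W (a, b, c)
# multiplies each V[i] elementwise into every row of W[i]
prod = W * V[:, np.newaxis, :]

Z = prod.sum(axis=2)  # contract over c, leaving shape (a, b)

# compare against an explicit per-slice dot
Z_loop = np.array([np.dot(W[i], V[i]) for i in range(a)])
assert np.allclose(Z, Z_loop)
```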
Upvotes: 2
Reputation: 414865
np.einsum("abc,ac -> ab", w, v)
import numpy as np

def z_loop(w, v):
    # define it to check that `einsum()` gives the necessary result
    z = np.empty(w.shape[:-1], dtype=w.dtype)
    for i in range(z.shape[0]):
        z[i, :] = np.dot(w[i, :], v[i, :])
    return z

w = np.random.uniform(size=(3, 4, 5))
v = np.random.uniform(size=w.shape[::2])
assert np.allclose(z_loop(w, v), np.einsum('abc,ac -> ab', w, v))
There might be simpler variants (via dot(), .reshape()) but einsum() is the most obvious for the task description.
def z_dot(w, v):
    # dot of (a, b, c) with (a, c, 1) pairs every w[i] with every v[j],
    # giving shape (a, b, a, 1); keep only the matching i == j entries
    z = np.dot(w, v[:, ..., np.newaxis])
    z = z.reshape(z.shape[:-1])        # drop the trailing length-1 axis -> (a, b, a)
    return np.diagonal(z, axis2=-1).T  # diagonal over axes 0 and 2, transposed -> (a, b)
assert np.allclose(z_dot(w, v), np.einsum('abc,ac -> ab', w, v))
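On newer NumPy versions one simpler variant (my addition, not part of the answer) is np.matmul, which broadcasts a stack of matrix-vector products over the leading axis:

```python
import numpy as np

w = np.random.uniform(size=(3, 4, 5))
v = np.random.uniform(size=(3, 5))

# v[:, :, np.newaxis] is a stack of (5, 1) column vectors; matmul broadcasts
# over the leading batch axis, so each w[i] multiplies its own v[i]
z = np.matmul(w, v[:, :, np.newaxis])[:, :, 0]

assert np.allclose(z, np.einsum('abc,ac -> ab', w, v))
```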
Upvotes: 5