Reputation: 453
I would like to code a simple variational autoencoder (VAE) in Keras. Because it's variational, I have to use the functional API for the encoder. However, it's giving me a dimension mismatch error and I can't figure out why. Here is my code and the error:
def sampling(args):
    z_mean, z_log_var = args
    epsilon = K.random_normal(shape=(100, 2),
                              mean=0., stddev=1)
    return z_mean + K.exp(z_log_var) * epsilon

def testFcn():
    K.clear_session()
    # Create the input
    inPut = Input(shape=(3,))
    # Encoder layers
    xEnc = Dense(128, input_shape=(3,), activation='relu')(inPut)
    xEnc = Dense(64, activation='relu')(xEnc)
    xEnc = Dense(32, activation='relu')(xEnc)
    # Distribution embedding
    z_mean = Dense(2, activation='relu')(xEnc)
    z_log_var = Dense(2, activation='relu')(xEnc)
    z = Lambda(sampling, output_shape=(2,))([z_mean, z_log_var])
    # Tying the model together
    encoder = Model(inPut, z, name='encoder')
    print("\n Encoder Model")
    encoder.summary()
    return encoder

# Create some random data
X = np.random.multivariate_normal([0]*3, np.eye(3), size=(100))
# Create the model
encoder = testFcn()
# Predict
encoder.predict(X)
This gives me the following error:
InvalidArgumentError: 2 root error(s) found.
(0) Invalid argument: Incompatible shapes: [32,2] vs. [100,2]
[[{{node lambda_1/mul}}]]
(1) Invalid argument: Incompatible shapes: [32,2] vs. [100,2]
[[{{node lambda_1/mul}}]]
[[lambda_1/add/_23]]
0 successful operations.
0 derived errors ignored.
Any help here is really appreciated.
Upvotes: 0
Views: 72
Reputation: 809
This might point you in the right direction: I was able to get it working by passing batch_size explicitly:

encoder.predict(X, batch_size=100)

The default batch_size for predict is 32, so each batch fed to your Lambda layer has 32 rows, while your sampling function hard-codes epsilon to shape (100, 2) — hence the [32,2] vs. [100,2] mismatch. More details here: https://keras.io/models/model/#predict
I hope this helps.
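As an alternative, you can avoid the hard-coded shape entirely by deriving the batch size from the incoming tensor, as the official Keras VAE example does. A minimal sketch of the rewritten sampling function (assuming a TensorFlow-backed Keras, with the same z_mean/z_log_var inputs as above):

```python
from tensorflow.keras import backend as K

def sampling(args):
    z_mean, z_log_var = args
    # Read the batch size from the tensor at runtime instead of hard-coding it,
    # so the layer works with any batch_size passed to predict/fit.
    batch = K.shape(z_mean)[0]
    dim = K.int_shape(z_mean)[1]
    epsilon = K.random_normal(shape=(batch, dim), mean=0., stddev=1.)
    return z_mean + K.exp(z_log_var) * epsilon
```

With this version you can drop the batch_size argument to predict altogether, since epsilon always matches the shape of z_mean.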
Upvotes: 1