I'm working on an application that should predict interesting moments in audio files with a length of 10 seconds. I converted each 50ms of audio to a note, so each record has one label (1 or 0, interesting or not) and 200 note features. Then I created 200 training examples:
from __future__ import absolute_import, division, print_function, unicode_literals
import functools
import tensorflow as tf
import tensorflow_datasets as tfds
from google.colab import drive

drive.mount('/content/gdrive')

def get_dataset(file_path):
    dataset = tf.data.experimental.make_csv_dataset(
        file_path,
        batch_size=12,
        label_name='label',
        na_value='?',
        num_epochs=1,
        ignore_errors=False)
    return dataset

train = get_dataset('/content/gdrive/My Drive/myProject/train.csv')
test = get_dataset('/content/gdrive/My Drive/myProject/test.csv')

# One numeric feature column per 50ms note: note1 .. note200
feature_columns = []
for number in range(200):
    feature_columns.append(tf.feature_column.numeric_column('note' + str(number + 1)))

preprocessing_layer = tf.keras.layers.DenseFeatures(feature_columns)

model = tf.keras.Sequential([
    preprocessing_layer,
    tf.keras.layers.Dense(50, activation=tf.nn.relu),
    tf.keras.layers.Dense(2, activation=tf.nn.softmax)
])

model.compile(
    loss='binary_crossentropy',
    optimizer='adam',
    metrics=['accuracy'])

model.fit(train, epochs=20)
Then my model returns this output on the 20th epoch:
17/17 [==============================] - 0s 7ms/step - loss: 0.6959 - acc: 0.5000
What am I doing wrong?
You are using tf.nn.softmax activation, so exactly one class is correct for each prediction; therefore you should be using categorical_crossentropy instead of binary_crossentropy as the loss function. I don't know if this is your only problem, but fixing it should at least solve one.
You could also use a single output unit with sigmoid activation and binary_crossentropy loss, instead of having two classes, 1. "interesting" and 2. "not-interesting" (each of which is always the inverse of the other). Then you would be training with a single boolean "interesting" label [1/0], as sketched below.
You can read more about softmax, sigmoid, binary_crossentropy and categorical_crossentropy here.