Reputation: 2061
I'm trying to do something similar to "Make a custom loss function in Keras", but I'm struggling with the implementation.
I have some data that relates age to failures:
# make some data
import pandas as pd

times = pd.array([10, 15, 22, 30, 4, 17, 38, 12, 17, 22])
events = pd.array([0, 1, 1, 1, 0, 1, 1, 0, 0, 1])
data = pd.DataFrame({'age': times, 'failure': events})
I have a parameterized function that is used to make predictions:
# this gives the y_pred values
import math

def calc_prob(param1, param2, param3, age):
    prob = (((100 * param1 * pow((100 / param3), -(pow((age / param2), param1))))
             * pow(age / param2, param1) * math.log(100 / param3)) / age) / 100
    return prob
and I have a metric I'd like to use in the cost function. I want the neural net to estimate parameters that minimize this function:
import statistics

# this is the metric to minimize
def brier_score(y_true, y_pred):
    # the Brier score acts as an MSE-style metric
    brier_score = statistics.mean(pow((y_pred - y_true), 2))
    return brier_score
from keras import models
from keras import layers

def build_model():
    model = models.Sequential()
    model.add(layers.Dense(1, activation='relu', input_shape=(data.shape[1],)))
    model.add(layers.Dense(5, activation='relu'))
    model.add(layers.Dense(3))
    model.compile(optimizer='rmsprop', loss='mse', metrics=['mae'])
    return model
The output of the model is three parameters that I would like to pass to calc_prob() inside the loss function. If, at a particular iteration, those parameters take the values below, I want the loss function to use calc_prob() to produce the predicted values:
params = [2.64, 30, 40]
y_pred = calc_prob(params[0], params[1], params[2], data['age'])
y_true = data['failure']
The cost function would be
brier_score(y_true, y_pred)
0.5672474859914267
I'm not sure how to properly wrap these functions so that I can use them with something like:
model.compile(loss='brier_score', optimizer='adam', metrics=['accuracy'])
Upvotes: 1
Views: 403
Reputation: 683
I'm not sure I understood you correctly.
Put "age" into labels
import tensorflow as tf

x = ...
y = {"age": data['age'], "prob": data['failure']}

def brier_score(y_true, y_pred):
    # y_pred holds the three estimated parameters; y_true carries the ages and observed failures
    prob = calc_prob(y_pred[:, 0], y_pred[:, 1], y_pred[:, 2], y_true["age"])
    # mean squared error between predicted and observed failure probabilities
    brier_score = tf.reduce_mean((prob - y_true["prob"]) ** 2, axis=-1)
    return brier_score

model.compile(loss=brier_score, optimizer='adam', metrics=['accuracy'])
model.fit(x=x, y=y, ...)
In calc_prob, just replace pow with tf.math.pow (and, likewise, math.log with tf.math.log) so the formula works on tensors.
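For reference, a minimal sketch of what that tensor-friendly version could look like; calc_prob_tf is just a hypothetical name, and the formula is copied unchanged from the question:

import tensorflow as tf

def calc_prob_tf(param1, param2, param3, age):
    # same formula as calc_prob in the question, expressed with TensorFlow ops
    # so it runs on batched tensors inside the loss function
    age = tf.cast(age, tf.float32)  # ages may arrive as an integer tensor
    prob = (((100.0 * param1 * tf.math.pow(100.0 / param3, -tf.math.pow(age / param2, param1)))
             * tf.math.pow(age / param2, param1) * tf.math.log(100.0 / param3)) / age) / 100.0
    return prob

The loss above would then call calc_prob_tf instead of calc_prob.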
In order to get positive network outputs for calc_prob, you can add an activation function to the last layer; softplus would probably be a better choice than relu, since it is strictly positive whereas relu can output exact zeros.
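A minimal sketch of that last suggestion, reusing the layer sizes from the question's build_model and the custom loss from above (the switch to tensorflow.keras imports and the softplus activation on the output layer are the only changes):

from tensorflow.keras import layers, models

def build_model():
    model = models.Sequential()
    model.add(layers.Dense(1, activation='relu', input_shape=(data.shape[1],)))
    model.add(layers.Dense(5, activation='relu'))
    # softplus keeps all three estimated parameters strictly positive
    model.add(layers.Dense(3, activation='softplus'))
    model.compile(loss=brier_score, optimizer='adam')
    return model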
Upvotes: 1