Reputation: 55
I'm working on a regression problem to reconstruct information and mitigate crosstalk effects from neighboring sensors, and I want to change the loss function so that it evaluates the standard deviation of an intermediate result computed inside my custom loss function.
Here is the code that I want to implement. My y_pred has shape (nEvents, 100).

I found many examples showing how to create a custom loss function, like this one:
```python
from tensorflow.keras import backend as K

def my_loss_func1(y_true, y_pred):
    # `layer` is a tensor captured from the enclosing scope in those
    # examples (e.g. an activation used as a regularization term)
    return K.mean(K.square(y_pred - y_true) + K.square(layer), axis=-1)
```
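For completeness, such a custom loss is passed to compile like any built-in one; a minimal sketch with a plain MSE variant (the Dense model here is just a placeholder, not my real network):

```python
import tensorflow as tf
from tensorflow.keras import backend as K

def mse_loss(y_true, y_pred):
    # plain mean squared error: my_loss_func1 without the extra layer term
    return K.mean(K.square(y_pred - y_true), axis=-1)

model = tf.keras.Sequential([tf.keras.layers.Dense(100, input_shape=(100,))])
model.compile(optimizer="adam", loss=mse_loss)
```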
But in those examples I didn't find how to access the data inside y_pred (a TensorFlow tensor). They all just call backend functions to evaluate the loss, passing y_pred and y_true as parameters.
What I need to do is:

```python
import numpy as np
from tensorflow.keras import backend as K

def my_loss_func2(y_true, y_pred):
    samples = y_pred  # <HOW DO I GET THE VALUES?>
    nSamp = 4                                # samples per signal
    signals = int(samples.shape[1] / nSamp)  # 100 / 4 = 25 signals
    # optimal-filter reconstruction of amplitude and time per signal;
    # ai and bi are fixed coefficient vectors of length nSamp
    AmpRec = np.tensordot(samples.reshape(samples.shape[0], signals, nSamp), ai, axes=(2, 0))
    TimeRec = np.tensordot(samples.reshape(samples.shape[0], signals, nSamp), bi, axes=(2, 0)) / AmpRec
    return K.std(TimeRec)
```
I want to improve my model so that the weight updates take into account the optimal_filter output, namely the spread (std) of TimeRec.
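To make the intended computation concrete, here is the same reconstruction on a plain NumPy array, outside of Keras (ai and bi are hypothetical placeholder coefficient vectors; this is exactly what I cannot do inside the loss, because there y_pred is a tensor):

```python
import numpy as np

nEvents, signals, nSamp = 10, 25, 4       # 25 signals * 4 samples = 100
ai = np.array([0.25, 0.25, 0.25, 0.25])   # amplitude weights (placeholder values)
bi = np.array([-0.5, -0.25, 0.25, 0.5])   # time weights (placeholder values)

samples = np.random.rand(nEvents, 100)    # stand-in for the values of y_pred
reshaped = samples.reshape(nEvents, signals, nSamp)
AmpRec = np.tensordot(reshaped, ai, axes=(2, 0))            # (nEvents, signals)
TimeRec = np.tensordot(reshaped, bi, axes=(2, 0)) / AmpRec  # (nEvents, signals)
print(TimeRec.std())                      # the spread I want as the loss
```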
Has anybody evaluated a loss function by accessing the values of y_pred like I want to do?
Upvotes: 1
Views: 140
Reputation: 55
Today, a few days later...

I found a solution after reading a post about how to print Keras tensors.
There are some methods to inspect and manipulate a tensor, like:

```python
K.eval(tensor)
K.print_tensor(tensor)
...
```
But to enable those, it's necessary to pass run_eagerly to the model's compile method:

```python
model.compile(..., run_eagerly=True)
```
After this single step, you can access the tensor's values and manipulate them as you need.
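A minimal end-to-end sketch of how the eager loss can look (the Dense model and the ai/bi coefficient values are placeholders, not my real network): values pulled out with .numpy() are good for inspection, while anything that should drive the weight updates has to stay in TensorFlow ops so the gradient path is preserved.

```python
import tensorflow as tf
from tensorflow.keras import backend as K

nSamp = 4
ai = tf.constant([0.25, 0.25, 0.25, 0.25])  # amplitude weights (placeholder values)
bi = tf.constant([-0.5, -0.25, 0.25, 0.5])  # time weights (placeholder values)

def my_loss_func2(y_true, y_pred):
    # both calls below only work because the model runs eagerly
    K.print_tensor(y_pred, message="y_pred =")
    values = y_pred.numpy()                 # plain NumPy copy, for inspection only
    print(values.shape, values.std())
    # the differentiable reconstruction stays in TensorFlow ops
    signals = y_pred.shape[1] // nSamp
    reshaped = tf.reshape(y_pred, (-1, signals, nSamp))
    AmpRec = tf.tensordot(reshaped, ai, axes=[[2], [0]])
    TimeRec = tf.tensordot(reshaped, bi, axes=[[2], [0]]) / AmpRec
    return K.std(TimeRec)

model = tf.keras.Sequential([tf.keras.layers.Dense(100, input_shape=(100,))])
model.compile(optimizer="adam", loss=my_loss_func2, run_eagerly=True)
```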
Upvotes: 1