Paul Chilvers

Reputation: 75

Is there a way to impose a constraint in TensorFlow? Could I enforce some rule along the way?

Is there some way to impose a constraint on the data generated by TensorFlow? For example, if my model produced two outputs, a and b, could you pre-enforce something like (a+b)/2 < 10, so the model wouldn't break this rule?

Thanks in advance

Upvotes: 4

Views: 444

Answers (1)

kafman

Reputation: 2860

If by "generated by TensorFlow" you mean generated by a neural network, I don't think it is possible in general: you can't guarantee that the output of a neural network never violates such hard constraints, especially at test time.

Here's what you could do:

  • Add a loss term, something like max(0, (a+b)/2 - 10). This will not guarantee that your constraint is never violated (the optimization of the NN is "best-effort"), but it pushes the network toward satisfying it. This loss term is, by the way, very similar to the hinge loss used in support vector machines.
  • Use an appropriate activation function. E.g., if you know your data must lie in [0, 1], use the sigmoid activation on the output.
  • "Project" the output back into the allowed range if it falls outside of it.

While the last two options guarantee feasibility, it is not always possible to apply them, or it is not clear how to do it and, even worse, how this will affect the learning. For example, if you see that (a+b)/2 >= 10, what will you do? Will you decrease b until the constraint is fulfilled, or trade off a and b somehow? Sometimes it is possible to define the "closest feasible point" w.r.t. some metric, but not in general.
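To make the three options concrete, here is a minimal plain-Python sketch of the underlying math. The names `a`, `b`, `limit`, and the "shift both outputs equally" projection rule are my own illustrative choices, not something prescribed by TensorFlow; in an actual TF model you would write the same expressions with tensor ops (e.g. `tf.maximum`, `tf.sigmoid`) so that gradients can propagate through them.

```python
import math

def hinge_penalty(a, b, limit=10.0):
    """Option 1: soft penalty max(0, (a+b)/2 - limit), added to the training loss.
    Zero when the constraint holds, grows linearly when it is violated."""
    return max(0.0, (a + b) / 2.0 - limit)

def sigmoid(x):
    """Option 2: a squashing activation guarantees the output lies in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def project(a, b, limit=10.0):
    """Option 3: if (a+b)/2 > limit, shift both outputs down equally so the
    mean lands exactly on the boundary (one possible notion of 'closest
    feasible point'; other metrics give other projections)."""
    excess = (a + b) / 2.0 - limit
    if excess > 0:
        a, b = a - excess, b - excess
    return a, b
```

For instance, with outputs a = 12 and b = 10 the mean is 11, so `hinge_penalty` returns 1.0 and `project` shifts both values down by 1, giving a mean of exactly 10. Note the design trade-off visible here: the penalty leaves the outputs untouched but may be violated, while the projection is always feasible but the choice of how to distribute the correction between a and b is arbitrary.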

Upvotes: 1

Related Questions