Donal Huang


Worried about breaking the computational graph and losing autograd support

For some reason, when changing my loss function in PyTorch, I have to use NumPy functions for part of the computation. But I'm very worried about whether using NumPy functions would make autograd fail. I would really appreciate it if you could tell me when I have to care about the computational graph and when I don't. In this code I guess it won't have an influence. Here is my detailed code:

import torch
import torch.nn as nn
from scipy.ndimage import distance_transform_edt

class Distance_Loss(nn.Module):
    def __init__(self):
        super(Distance_Loss, self).__init__()
        self.alpha = .6   # weighting factors, currently unused in forward
        self.beta = .4

        self.MSELoss = nn.MSELoss()

    def forward(self, input, target):
        # distance_transform_edt works on numpy arrays, so input has to be
        # detached and moved to the CPU first -- which is exactly the step
        # I worry breaks the graph
        edt = distance_transform_edt(input.detach().cpu().numpy())
        return self.MSELoss(torch.tensor(edt, dtype=target.dtype, device=target.device), target)
# input may be the output of my own network, target is my ground truth

Or, I really want to know how I can check that all the grad properties are right!
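
For example, would a quick check like this be the right way? This is just a sketch with dummy shapes, using my Distance_Loss from above:

import torch

x = torch.randn(1, 1, 8, 8, requires_grad=True)   # dummy network output
target = torch.randn(1, 1, 8, 8)                  # dummy ground truth

loss = Distance_Loss()(x, target)

# a tensor that autograd can backpropagate through has requires_grad=True
# and a non-None grad_fn pointing at the op that created it
print(loss.requires_grad)
print(loss.grad_fn)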

If it does affect the grad, would the following code be useful?

x_numpy = x.detach().cpu().numpy()           # detach() is required before numpy() if x requires grad
x_restored = torch.from_numpy(x_numpy).to(x.device)
x_restored.requires_grad = x.requires_grad   # restores the flag, but does it restore the graph?
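
My guess, from a toy sketch (y = x * 2 is just a stand-in for a real computation), is that this round trip keeps the values and the requires_grad flag but not the history:

import torch

x = torch.randn(3, requires_grad=True)
y = x * 2                                   # y.grad_fn is <MulBackward0>, attached to x

y_numpy = y.detach().cpu().numpy()          # detach() is needed before numpy()
y_restored = torch.from_numpy(y_numpy).to(y.device)
y_restored.requires_grad = y.requires_grad  # copies the flag only

print(y.grad_fn)           # <MulBackward0 object ...>
print(y_restored.grad_fn)  # None, so backward() from here cannot reach x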

Upvotes: 0

Views: 38

Answers (0)
