Hitman01

Reputation: 1

Least squared optimization with SciPy but without smoothing

I'm currently trying to unfold a neutron spectrum from Bonner sphere spectrometry measurements (which is essentially an under-determined problem). I have been using least_squares from scipy.optimize, but I get the feeling that there is a smoothing step which is "flattening" my spectrum. The peaks are being broadened, and I suspect some sort of continuity condition or smoothing step is being applied, producing a somewhat sub-optimal result. Given the nature of the problem, there should be some narrow peaks (neutrons at a given energy value) if the reconstruction is to match the measured counts accurately.
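To make the under-determination concrete: with a handful of spheres and many energy bins, the response matrix has far fewer rows than columns, so infinitely many spectra reproduce the counts exactly, and a regularizer has to pick one of them. A toy illustration (the 6-sphere / 50-bin shapes here are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
n_spheres, n_bins = 6, 50                    # 6 measurements, 50 energy bins (hypothetical)
kernel = rng.random((n_spheres, n_bins))     # response matrix R

# The rank can be at most the number of measurements, so the null
# space has n_bins - rank dimensions: adding any null-space component
# to a spectrum leaves the predicted counts unchanged.
rank = np.linalg.matrix_rank(kernel)
print(rank, n_bins - rank)
```

Any smoothing-type regularizer will then prefer the broad, low-norm member of that solution family over a spiky one, which is consistent with the flattened peaks described above.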

My functions are the following:


import numpy as np
from scipy.optimize import least_squares

def objective(spectrum, kernel, counts, energies):
    # Residual of the (regularized) normal equations: K·s - c
    residual = np.matmul(kernel, spectrum) - counts
    return residual

def L2_regularization_algorithm(alpha, beta, kernel, guess, counts, energies):
    # beta is currently unused; alpha scales the damping term.
    guess_L2_reg = guess.reshape((len(guess),))
    counts_L2_reg = counts.reshape((len(counts),))
    used_kernel = kernel.copy()

    # Normal equations with Levenberg-Marquardt-style diagonal damping:
    # (R^T R + alpha * diag(R^T R)) s = R^T c
    kernel_T = np.transpose(used_kernel)
    K = np.dot(kernel_T, used_kernel)
    LM = np.diag(np.diag(K))
    transfer = K + alpha * LM
    counts_L2_reg = np.matmul(kernel_T, counts_L2_reg)

    # Non-negativity bounds keep the spectrum physical
    bounds = (0, np.inf)
    result = least_squares(objective, guess_L2_reg, args=(transfer, counts_L2_reg, energies), bounds=bounds)
    x = result.x
    return x.reshape((len(x), 1))
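For reference, here is a self-contained run of the same steps on synthetic data (the kernel shape, peak position, and alpha value are all made up for illustration):

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)
kernel = rng.random((6, 30))          # hypothetical 6-sphere response matrix
true_spectrum = np.zeros(30)
true_spectrum[12] = 1.0               # a single narrow peak
counts = kernel @ true_spectrum       # noiseless synthetic counts

# Same steps as the function above: form K = R^T R, damp its diagonal,
# and solve the normal equations with non-negativity bounds.
alpha = 1e-3
K = kernel.T @ kernel
transfer = K + alpha * np.diag(np.diag(K))
rhs = kernel.T @ counts

res = least_squares(lambda s: transfer @ s - rhs,
                    np.ones(30), bounds=(0, np.inf))
print(res.x.shape)
```

With only 6 measurements for 30 bins, the recovered spectrum will generally spread the single peak across several bins, reproducing the broadening effect in question.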

Does anyone have any idea how to solve this kind of problem?

I tried the different loss options available to reduce this effect; the only choices were linear, soft_l1, huber, cauchy and arctan. This function is giving the best results, but they are still far from perfect. And I know the problem is solvable, because a commercial software package gives me very good results, but I don't have access to its internal code. You can see the current result here:
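For completeness: linear, soft_l1, huber, cauchy and arctan are the values of the loss parameter of scipy.optimize.least_squares. They are robust losses that down-weight large residuals rather than smoothing the solution itself, so swapping between them mostly changes outlier handling. Passing one looks like this (toy residual and f_scale value are made up):

```python
import numpy as np
from scipy.optimize import least_squares

# A simple two-parameter residual; the minimum is at x = [1, 2]
# regardless of which robust loss is chosen, since the residual
# can be driven to zero there.
res = least_squares(lambda x: np.array([x[0] - 1.0, 10.0 * (x[1] - 2.0)]),
                    np.zeros(2), loss="soft_l1", f_scale=1.0)
print(res.x)
```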

Best result so far

Upvotes: 0

Views: 42

Answers (0)