Henri.D

Reputation: 145

How to create a neural network with multiple output layers (Julia, Flux)

Say I have the following neural network:

net = Chain(Dense(3, 5, σ), Dense(5, 2, ???))

I would like to know what to put instead of ??? so that my first output neuron goes through a ReLU activation function and the second through a sigmoid. The network's output is a pair of actions to perform: the first is a positive real value and the other a binary choice.

I cannot simply define a custom "relu_sigma" activation that makes the choice, because Dense applies its activation elementwise: the function receives a single value at a time, not the whole array, so it has no way of knowing whether it is being applied to the first or the second component of Wx + b.

More generally, I would like to know how to build this kind of network with any number of activation functions applied to any number of neurons (e.g. 5 relu, 2 sigmoid, and a softmax on the last 4).
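To make the goal concrete, here is a rough sketch of a workaround for the two-output case (the helper name relu_sigma_split is just illustrative, and it assumes a single input vector rather than a batch): leave the last Dense layer linear and apply the activations afterwards in the Chain. I would prefer to express this inside the layer itself.

using Flux

# Sketch: the last Dense layer keeps the default identity activation, and a
# plain function in the Chain applies relu to the first output and σ to the
# second. Works on a single input vector, not on a batch of columns.
relu_sigma_split(y) = vcat(relu.(y[1:1]), σ.(y[2:2]))

net = Chain(Dense(3, 5, σ), Dense(5, 2), relu_sigma_split)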

Upvotes: 2

Views: 725

Answers (1)

Henri.D

Reputation: 145

I defined a custom layer type as follows. It is not fully general: it only applies relu to the first half of the neurons and σ to the second half, but that is what I needed for my application, and generalizing it should not be too complicated (a rough sketch of one possible generalization follows the code).

using Flux   # provides relu and σ; param is the parameter wrapper from older (Tracker-based) Flux versions

struct reluSigma
    W
    b
end
reluSigma(in::Integer, out::Integer) = reluSigma(param(randn(out, in)), param(randn(out)))

function (m::reluSigma)(x)
    l  = length(m.b) ÷ 2           # split point: first half gets relu, second half gets σ
    r1 = 1:l
    r2 = l+1:length(m.b)
    # apply each activation elementwise to its own slice of W*x .+ b
    vcat(relu.(m.W[r1, :] * x .+ m.b[r1]), σ.(m.W[r2, :] * x .+ m.b[r2]))
end
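
As a rough sketch of one possible generalization (untested, the name MultiActivation is illustrative, and param is the same older-Flux parameter wrapper as above): store a tuple of functions together with the ranges of output neurons each one applies to, where each function maps its slice of W*x .+ b to the corresponding slice of the output.

struct MultiActivation
    W
    b
    fs        # e.g. (y -> relu.(y), y -> σ.(y), softmax)
    ranges    # e.g. (1:5, 6:7, 8:11)
end
MultiActivation(in::Integer, out::Integer, fs, ranges) =
    MultiActivation(param(randn(out, in)), param(randn(out)), fs, ranges)

function (m::MultiActivation)(x)
    y = m.W * x .+ m.b
    # apply each function to its own slice of the pre-activation and stack the results
    vcat((f(y[r]) for (f, r) in zip(m.fs, m.ranges))...)
end

For the example in the question (5 relu, 2 sigmoid, and a softmax on the last 4) this would be constructed as something like MultiActivation(5, 11, (y -> relu.(y), y -> σ.(y), softmax), (1:5, 6:7, 8:11)), with the 5 inputs chosen only for illustration.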

Upvotes: 3
