mathnoob123

Reputation: 229

Visualizing the decision boundary of a Keras model

I am trying to plot the decision boundary of a Keras model's predictions. However, the boundary that gets generated looks incorrect.

Here's my model

from keras.models import Sequential
from keras.layers import Dense
from keras import optimizers

def base():
    # Small binary classifier: 2 input features -> 1 sigmoid output
    model = Sequential()
    model.add(Dense(5, activation='relu', input_dim=2))
    model.add(Dense(2, activation='relu'))
    model.add(Dense(1, activation='sigmoid'))
    model.compile(optimizer=optimizers.SGD(lr=0.0007, momentum=0.0, decay=0.0),
                  loss='binary_crossentropy', metrics=['accuracy'])
    return model

model = base()
history = model.fit(train_X, train_Y, epochs=10000, batch_size=64, verbose=2)

And here's my plot function (taken from here)

from numpy import linspace, meshgrid, c_
from matplotlib.pyplot import subplots
from matplotlib.cm import get_cmap

def plot_decision_boundary(X, y, model, steps=1000, cmap='Paired'):
    """
    Function to plot the decision boundary and data points of a model.
    Data points are colored based on their actual label.
    """
    cmap = get_cmap(cmap)

    # Define region of interest by data limits
    xmin, xmax = X[:, 0].min() - 1, X[:, 0].max() + 1
    ymin, ymax = X[:, 1].min() - 1, X[:, 1].max() + 1
    x_span = linspace(xmin, xmax, steps)
    y_span = linspace(ymin, ymax, steps)
    xx, yy = meshgrid(x_span, y_span)

    # Make predictions across region of interest
    labels = model.predict(c_[xx.ravel(), yy.ravel()])

    # Plot decision boundary in region of interest
    z = labels.reshape(xx.shape)

    fig, ax = subplots()
    ax.contourf(xx, yy, z, cmap=cmap, alpha=0.5)

    # Plot the training points, colored by their actual labels
    ax.scatter(X[:, 0], X[:, 1], c=y.ravel(), cmap=cmap, lw=0)

    return fig, ax

plot_decision_boundary(train_X, train_Y, model, cmap='RdBu')

And I get a plot like this

[image: contour plot filled with many nested boundary bands]

This is obviously a very flawed depiction of a decision boundary (not informative at all, since there are so many boundaries). Can somebody point out the error in my case?

Upvotes: 3

Views: 5928

Answers (1)

Lokesh Kumar

Reputation: 909

Since the predicted probability is a continuous value from 0 to 1, we get many contours.
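For instance, inspecting the raw output of the question's model shows floats rather than class labels (a quick check, assuming train_X is the NumPy array from the question):

probs = model.predict(train_X[:5])
print(probs.shape)  # (5, 1): one sigmoid probability per sample
print(probs)        # values anywhere in [0, 1], not just 0 or 1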

If your visualization is restricted to 2 classes (the output is a 2D softmax vector), you can use this simple code:

import numpy as np
from matplotlib import pyplot as plt

def plot_model_out(x, y, model):
    """
    x, y: 2D meshgrid input
    model: Keras Model API object
    """
    # Flatten the grid into a list of (x, y) points for prediction
    grid = np.stack((x, y))
    grid = grid.T.reshape(-1, 2)
    outs = model.predict(grid)
    # Reshape the first output component back onto the (square) grid
    y1 = outs.T[0].reshape(x.shape[0], x.shape[0])
    plt.contourf(x, y, y1)
    plt.show()
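A hypothetical call might look like this (the grid range and resolution are made up here; pick them so they cover your training data):

a = np.linspace(-5, 5, 200)      # assumed feature range
xx, yy = np.meshgrid(a, a)       # square grid, as plot_model_out expects
plot_model_out(xx, yy, model)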

This will give contours (more than one). If you want a single contour line, you can do the following.

You can threshold the probability output from model.predict and display a single contour line.

For Example,

import numpy as np
from matplotlib import pyplot as plt

a = np.linspace(-5, 5, 100)
xx, yy = np.meshgrid(a, a)
z = xx**2 + yy**2   # continuous values over the grid
# z = z > 5         # threshold value
plt.contourf(xx, yy, z)
plt.show()

With the threshold line commented out and uncommented, we get the following two images.

[image: multiple contours]

Multiple contours due to continuous values

[image: single contour]

Single contour, since z is thresholded (z = z > 5)

A similar thresholding can be applied to the output softmax vector, like this:

label = label > 0.5
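Applied to the plot_decision_boundary function from the question, a minimal sketch of the fix (keeping the question's variable names) would be to threshold the grid predictions before contouring:

# inside plot_decision_boundary, after predicting over the grid
labels = model.predict(c_[xx.ravel(), yy.ravel()])
labels = (labels > 0.5).astype(int)           # collapse probabilities to 0/1
z = labels.reshape(xx.shape)
ax.contourf(xx, yy, z, cmap=cmap, alpha=0.5)  # now a single boundary is drawn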

For more information on visualization code, refer to the IITM CVI Blog.

Upvotes: 4
