Multiple Regression on TensorFlow

I'm really new to Stack Overflow, so please forgive me if I say something stupid. I was coding this multiple linear regression with the TensorFlow library, but for some reason it just doesn't work: the loss keeps increasing and then becomes NaN.

# coding: utf-8

import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt
import seaborn as sns
get_ipython().magic('matplotlib inline')

from sklearn.datasets import load_boston    

data=load_boston()

X_data = data.data
y_data = data.target
m = len(X_data)
n = len(X_data[0])

# placeholders for the feature matrix and the targets
X = tf.placeholder(tf.float32,[m,n])
y = tf.placeholder(tf.float32,[m,1])

# weights and bias, initialized to ones
W = tf.Variable(tf.ones([n,1]))
b = tf.Variable(tf.ones([1]))

# linear model
y_ = tf.matmul(X,W)+b

# mean squared error
loss = tf.reduce_mean(tf.square( y - y_))

optimizer = tf.train.GradientDescentOptimizer(0.01)
train = optimizer.minimize(loss)

with tf.Session() as sess:
    init = tf.global_variables_initializer()
    sess.run(init)
    vals = []
    for i in range(100):
        val = sess.run([train,loss],feed_dict={X:X_data , y:y_data[:,None]})
        vals.append(val)

print(vals)

Output is:

[[None, 823712.88],
 [None, 3.2238912e+13],
 [None, 1.2631078e+21],
 [None, 4.9488092e+28],
 [None, 1.9389255e+36],
 [None, inf],
 [None, inf],
 [None, inf],
 [None, inf],
 [None, inf],
 [None, inf],
 [None, nan],
 [None, nan],
 [None, nan],
 [None, nan],
 [None, nan],
...
 [None, nan],
 [None, nan]]

I can't find where it's going wrong. Help, anyone?

Upvotes: 2

Views: 1126

Answers (1)

cnapun

Reputation: 93

It appears your learning rate is too high. The Boston housing features are on very different scales, so with a step size of 0.01 gradient descent overshoots on every update and the loss blows up to inf and then NaN. If you decrease the learning rate to something like 1e-6, this converges.
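For concreteness, here is a minimal sketch of that change applied to the code in the question (same TF1-style code; the only edit is the learning rate passed to GradientDescentOptimizer, and the final print line is just mine for illustration):

import tensorflow as tf
from sklearn.datasets import load_boston

data = load_boston()
X_data = data.data
y_data = data.target
m, n = X_data.shape

X = tf.placeholder(tf.float32, [m, n])
y = tf.placeholder(tf.float32, [m, 1])
W = tf.Variable(tf.ones([n, 1]))
b = tf.Variable(tf.ones([1]))

y_ = tf.matmul(X, W) + b
loss = tf.reduce_mean(tf.square(y - y_))

# Only change: 1e-6 instead of 0.01, so the updates stay small enough
# for the unscaled Boston features.
train = tf.train.GradientDescentOptimizer(1e-6).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for i in range(100):
        _, l = sess.run([train, loss], feed_dict={X: X_data, y: y_data[:, None]})
    print(l)  # loss now decreases instead of blowing up to inf/NaN

Alternatively (my suggestion, not part of the original fix), you can standardize the features first, e.g. with sklearn.preprocessing.StandardScaler, which puts all columns on a comparable scale; the original learning rate of 0.01 then also converges.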

Upvotes: 1
