Jaffer Wilson

Reputation: 7273

How to forecast using the Tensorflow model?

I have created a TensorFlow program to predict the close prices of forex. I have successfully created the predictions but failed to understand how to forecast values for the future. The following is my prediction function:

test_pred_list = []

def testAndforecast(xTest1, yTest1):
    truncated_backprop_length = 3
    with tf.Session() as sess:
        tf.global_variables_initializer().run()
        value = ""
        try:
            # The counter file stores the number of the last saved checkpoint.
            with open("Checkpointcounter.txt", "r") as file:
                value = file.read()
        except FileNotFoundError:
            print("First Time Running Training!....")
        if tf.train.checkpoint_exists("models\\model" + value + ".ckpt"):
            saver.restore(sess, "models\\model" + value + ".ckpt")
            print("models\\model" + value + ".ckpt Session Loaded for Testing")

        for test_idx in range(len(xTest1) - truncated_backprop_length):
            testBatchX = xTest1[test_idx:test_idx + truncated_backprop_length, :].reshape(
                (1, truncated_backprop_length, num_features))
            testBatchY = yTest1[test_idx:test_idx + truncated_backprop_length].reshape(
                (1, truncated_backprop_length, 1))

            feed = {batchX_placeholder: testBatchX,
                    batchY_placeholder: testBatchY}

            # test_pred contains 'window_size' predictions; we want the last one.
            _last_state, _last_label, test_pred = sess.run(
                [last_state, last_label, prediction], feed_dict=feed)
            test_pred_list.append(test_pred[-1][-1])

Here is the complete Jupyter notebook and the datasets for training and testing:
My repository with code.

Kindly help me understand how I can forecast the close values for the future. Please do not share anything related to predictions, as I have already tried that. Kindly let me know how to forecast without any external input, purely on the basis of the training I have done.

I hope to hear from you soon.

Upvotes: 9

Views: 1371

Answers (2)

Rajesh Dua

Reputation: 11

I am not sure if your issue was resolved. I was facing a similar problem, and this is how I solved it:

  • I used the same code, with some adjustments to the number of neurons, batch size, past history and the evaluation period.
  • I ran the code with multivariate data to predict one step ahead.
  • I fed the predicted value back into the dataset.
  • Recalculated the indicators.
  • And ran the loop for the next 30 minutes (I was using 1-minute candle data).

This worked. A rough sketch of this feedback loop is shown below.
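A minimal sketch of that loop, assuming a pandas DataFrame of candles; predict_one_step and recalculate_indicators are placeholders standing in for the actual model call and indicator code, not functions from the notebook:

import pandas as pd

def recalculate_indicators(df):
    """Recompute whatever indicators the model expects; a moving average stands in here."""
    df["sma_3"] = df["close"].rolling(3).mean()
    return df

def predict_one_step(model, window):
    """Placeholder for the trained model's one-step-ahead close prediction."""
    return model(window)

def rolling_forecast(model, history, steps=30, window_size=3):
    """Predict one candle ahead, append it to the data, recompute indicators, repeat."""
    df = history.copy()
    forecasts = []
    for _ in range(steps):
        window = df.iloc[-window_size:].values       # last `window_size` rows of features
        next_close = predict_one_step(model, window)
        forecasts.append(next_close)
        new_row = df.iloc[-1].copy()                 # reuse the last row as a template
        new_row["close"] = next_close                # overwrite it with the prediction
        df = pd.concat([df, new_row.to_frame().T], ignore_index=True)
        df = recalculate_indicators(df)              # keep indicator columns consistent
    return forecasts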

Let me know if you need further help.

Upvotes: 1

Břetislav Hájek

Reputation: 3686

If I understand your question correctly, by forecasting you mean predicting multiple closing prices in the future (for example, the next 5 closing prices from the current state). I went through your Jupyter notebook. In short, you cannot easily do that.

Right now your code takes the last three positions, each defined by multiple features (open/low/high/close prices and some indicator values). Based on that, you predict the next closing price. If you would like to predict an even further position, you would have to create an "artificial" position based on the predicted closing price. Here you can approximate that the open price is the same as the previous close, but you can only guess the high and low prices. Then you would calculate the other features/values (from the indicators) and use this position together with the previous two to predict the next closing price. You can continue like this for future steps.

The issue is in the open/low/high prices, because you can only approximate them. You could remove them from the data, retrain the model, and make predictions without them, but they may be necessary for the indicator calculations.
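For illustration, this is roughly how such an artificial position could be built when only the close price is predicted; the function name and row layout are placeholders, not code from the notebook:

def make_artificial_position(prev, pred_close):
    """prev: last known candle (dict/Series with open/high/low/close); pred_close: predicted close."""
    open_ = prev["close"]              # approximate: next open equals the previous close
    high = max(open_, pred_close)      # guess: no intrabar extremes beyond open/close
    low = min(open_, pred_close)
    return {"open": open_, "high": high, "low": low, "close": pred_close}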


I have compressed your code here to show the approach of predicting all OHLC prices:

import tensorflow as tf

# Data (datasetTrain is the training DataFrame from the notebook)
xTrain = datasetTrain[
    ["open", "high", "low", "close", "k",
     "d", "atr", "macdmain", "macdsgnal",
     "bbup", "bbmid", "bblow"]].as_matrix()
yTrain = datasetTrain[["open", "high", "low", "close"]].as_matrix()

# Settings
batch_size = 1
num_batches = 1000
truncated_backprop_length = 3
state_size = 12

num_features = 12
num_classes = 4

# Graph
batchX_placeholder = tf.placeholder(
    dtype=tf.float32,
    shape=[None, truncated_backprop_length, num_features],
    name='data_ph')
batchY_placeholder = tf.placeholder(
    dtype=tf.float32,
    shape=[None, num_classes],
    name='target_ph')


cell = tf.contrib.rnn.BasicRNNCell(num_units=state_size)
states_series, current_state = tf.nn.dynamic_rnn(
    cell=cell,
    inputs=batchX_placeholder,
    dtype=tf.float32)

states_series = tf.transpose(states_series, [1,0,2])

last_state = tf.gather(
    params=states_series,
    indices=states_series.get_shape()[0]-1)

weight = tf.Variable(tf.truncated_normal([state_size, num_classes]))
bias = tf.Variable(tf.constant(0.1, shape=[num_classes]))

prediction = tf.matmul(last_state, weight) + bias


loss = tf.reduce_mean(tf.squared_difference(batchY_placeholder, prediction))
train_step = tf.train.AdamOptimizer(learning_rate=0.001).minimize(loss)

# Training
sess = tf.Session()
sess.run(tf.global_variables_initializer())

for batch_idx in range(num_batches):
    start_idx = batch_idx
    end_idx = start_idx + truncated_backprop_length


    batchX = xTrain[start_idx:end_idx,:].reshape(batch_size, truncated_backprop_length, num_features)
    batchY = yTrain[end_idx].reshape(batch_size, num_classes)


    feed = {batchX_placeholder: batchX, batchY_placeholder: batchY}

    _loss, _train_step, _pred = sess.run(
        fetches=[loss, train_step, prediction],
        feed_dict=feed)

I don't think it is necessary to write out the whole code, and I don't know how the indicators are calculated. Also, you should change the way the data is fed, because right now it only works with batches of size 1.
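To actually forecast several steps ahead with this graph, you can slide the input window forward yourself and write each predicted OHLC row into the next window. A minimal sketch, assuming the placeholders and the prediction tensor defined above, and that the first four feature columns are open/high/low/close; the indicator columns are simply carried over from the last known row here, whereas in practice you would recompute them:

import numpy as np

def forecast_ohlc(sess, last_window, steps=5):
    """last_window: the final `truncated_backprop_length` rows of the feature matrix."""
    window = last_window.copy()
    results = []
    for _ in range(steps):
        feed = {batchX_placeholder: window.reshape(
            1, truncated_backprop_length, num_features)}
        ohlc = sess.run(prediction, feed_dict=feed)[0]   # predicted open/high/low/close
        results.append(ohlc)
        new_row = window[-1].copy()        # reuse the last feature row as a template
        new_row[:4] = ohlc                 # overwrite its OHLC columns with the prediction
        window = np.vstack([window[1:], new_row])        # slide the window one step forward
    return np.array(results)

# Example: forecast_ohlc(sess, xTrain[-truncated_backprop_length:], steps=5)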

Upvotes: 3
