Reputation: 890
I have univariate time series data and I want to do multistep prediction.
I came across this question, which explains one-step time series prediction, but I am interested in multistep-ahead prediction.
E.g., typical univariate time series data looks like
time    value
----    -----
t1      a1
t2      a2
...     ...
t100    a100
Suppose I want a 3-step-ahead prediction. Can I frame my problem like this?
TrainX                 TrainY
[a1,a2,a3,a4,a5,a6] -> [a7,a8,a9]
[a2,a3,a4,a5,a6,a7] -> [a8,a9,a10]
[a3,a4,a5,a6,a7,a8] -> [a9,a10,a11]
..................     ...........
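For reference, here is a minimal sketch of how those sliding windows could be built (my own illustration; series, make_windows, n_in and n_out are assumed names, not from the original question):

import numpy as np

def make_windows(series, n_in=6, n_out=3):
    # Slide a window of n_in inputs and n_out targets over the series.
    X, y = [], []
    for i in range(len(series) - n_in - n_out + 1):
        X.append(series[i:i + n_in])                    # e.g. [a1..a6]
        y.append(series[i + n_in:i + n_in + n_out])     # e.g. [a7, a8, a9]
    return np.array(X), np.array(y)

# `series` is assumed to be a 1-D NumPy array holding a1..a100.
TrainX, TrainY = make_windows(series)   # shapes: (n_samples, 6) and (n_samples, 3)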
I am using Keras with TensorFlow as the backend.
The first layer has 50 neurons and expects 6 inputs, the hidden layer has 30 neurons, and the output layer has 3 neurons (i.e., it outputs three time series values).
from keras.models import Sequential
from keras.layers import Dense
from keras import regularizers

model = Sequential()
model.add(Dense(50, input_dim=6, activation='relu', kernel_regularizer=regularizers.l2(0.01)))
model.add(Dense(30, activation='relu', kernel_regularizer=regularizers.l2(0.01)))
model.add(Dense(3))
model.compile(loss='mean_squared_error', optimizer='adam')
model.fit(TrainX, TrainY, epochs=300, batch_size=16)
My model should then be able to predict a107, a108, a109 when my input is a101, a102, a103, a104, a105, a106. Is this a valid model? Am I missing something?
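For example, the 3-step forecast would come from a single predict call (an illustrative sketch; `values` is an assumed name for the array holding the observed series):

# `values` is assumed to hold the observed series, with a101..a106 as its last six entries.
last_window = values[-6:].reshape(1, 6)    # shape (1, 6) to match input_dim=6
next_three = model.predict(last_window)    # shape (1, 3): predicted a107, a108, a109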
Upvotes: 1
Views: 479
Reputation: 86600
That model might do it, but you would probably benefit from using LSTM layers (recurrent networks for sequences).
# TrainX.shape = (total samples, time steps, features per step)
# TrainX.shape = (total samples, 6, 1)
model.add(LSTM(50, input_shape=(6, 1), return_sequences=True, ....))
model.add(LSTM(30, return_sequences=True, ....))
model.add(LSTM(3, return_sequences=False, ....))
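Put together, a minimal runnable version of that stack might look like this (a sketch under my own assumptions: the "...." placeholders are simply omitted, TrainX/TrainY come from the windowing shown in the question, and TrainX is reshaped to (samples, 6, 1)):

from keras.models import Sequential
from keras.layers import LSTM

model = Sequential()
model.add(LSTM(50, input_shape=(6, 1), return_sequences=True))
model.add(LSTM(30, return_sequences=True))
model.add(LSTM(3, return_sequences=False))   # last layer emits the 3 future values
model.compile(loss='mean_squared_error', optimizer='adam')
model.fit(TrainX.reshape(-1, 6, 1), TrainY, epochs=300, batch_size=16)

Note that the default tanh activation of the last LSTM confines its outputs to (-1, 1), which ties into the scaling point below.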
You may be missing an activation function that limits the result to the possible range of the value you want to predict.
Often we work with values from 0 to 1 (activation='sigmoid') or from -1 to 1 (activation='tanh').
This would also require limiting the inputs to the same range, since the inputs and outputs come from the same series.
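One common way to do that scaling (my own illustration using scikit-learn's MinMaxScaler; series, last_window and model are assumed to exist as in the question, they are not defined here):

import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Scale the whole series into [0, 1] before building the training windows.
scaler = MinMaxScaler(feature_range=(0, 1))
scaled = scaler.fit_transform(series.reshape(-1, 1)).ravel()

# After training on windows built from `scaled`, map predictions back to the original units.
pred_scaled = model.predict(last_window)                              # shape (1, 3), values in [0, 1]
pred = scaler.inverse_transform(pred_scaled.reshape(-1, 1)).ravel()   # predicted a107, a108, a109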
Upvotes: 2