Reputation: 91
Could someone give me a detailed explanation of how to measure the run-time, CPU usage and memory usage of this Python program, which makes forecasts using an ARIMA model:
from math import sqrt
import time

import pandas as pd
import matplotlib.pyplot as plt
from sklearn.metrics import mean_squared_error
from statsmodels.tsa.arima_model import ARIMA


def ARIMA_forecast(series, df):
    X = series.values
    size = int(len(X) * 0.66)
    train, test = X[0:size], X[size:len(X)]
    history = [x for x in train]
    predictions = list()
    start_time = time.process_time()
    # walk-forward validation: refit the model for each test observation
    for t in range(len(test)):
        model = ARIMA(history, order=(4, 1, 0))
        model_fit = model.fit(disp=0)
        output = model_fit.forecast()
        yhat = output[0]
        predictions.append(yhat)
        obs = test[t]
        history.append(obs)
        print('predicted=%f, expected=%f' % (yhat, obs))
    # evaluate forecasts
    end_time = time.process_time()
    print(end_time - start_time)
    rmse = sqrt(mean_squared_error(test, predictions))
    print('Test RMSE: %.3f' % rmse)
    # plot forecasts against actual outcomes
    plt.plot(series, label='Training data')
    plt.plot(series[size:len(X)].index, predictions, color='blue', label='Predicted Price')
    plt.plot(series[size:len(X)].index, test, color='red', label='Actual Price')
    plt.legend()
    plt.show()


df = pd.read_csv('MSFT.csv', header=0, index_col=0, parse_dates=True)
series = df['Adj Close']
ARIMA_forecast(series, df)
Right now I am using time.process_time() to measure the run-time. I am not sure how to measure CPU and memory usage. I have read about using psutil, but I don't know exactly how to do this.
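From what I understand of the psutil docs, a sketch along these lines (wrapping the existing ARIMA_forecast call) might report both CPU time and memory for the process, but I am not sure it is the right approach:

import os
import psutil

proc = psutil.Process(os.getpid())

# snapshot CPU times and resident memory before the forecast
cpu_before = proc.cpu_times()
rss_before = proc.memory_info().rss

ARIMA_forecast(series, df)

# snapshot again afterwards and report the difference
cpu_after = proc.cpu_times()
rss_after = proc.memory_info().rss
print('User CPU time: %.2f s' % (cpu_after.user - cpu_before.user))
print('System CPU time: %.2f s' % (cpu_after.system - cpu_before.system))
print('RSS growth: %.1f MiB' % ((rss_after - rss_before) / 1024 ** 2))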
Upvotes: 0
Views: 328
Reputation: 1465
For memory profiling I've used pympler and Guppy; they work well. For CPU profiling, your code seems to be CPU-intensive, so CPU time is essentially the same as execution time. Other than that, I've developed perf_tools, which, in addition to the profiling you can already do with time.*_time(), can show you the time consumed by each part of your code. For example, the loop in the function can be difficult to profile well, and you can measure exactly the CPU-intensive part versus the plotting side.
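As a minimal sketch of the pympler approach (assuming pympler is installed and your script already defines ARIMA_forecast, series and df), a SummaryTracker prints a summary of what was allocated while the function ran:

from pympler import tracker

tr = tracker.SummaryTracker()  # baseline snapshot of live objects
ARIMA_forecast(series, df)
tr.print_diff()                # objects created (and memory used) since the baseline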
Upvotes: 1