Apple Stock-Price Forecasting
Pinning down future stock prices to the exact dot; in other words, an attempt at the impossible.
- Project At A Glance
- Dataset
- Scaling
- Train-Test Split
- Time-Series Window
- Model Setup and Layers
- Prediction and Metrics
- Present Forecast Plot
- Future Extension
- Predicting the Next 30 Days
Project At A Glance
Objective
: Forecast Apple (AAPL) stock prices from five years of time-series data, and extrapolate beyond the observed window.
Data
: AAPL Stock Dataset from Tiingo API [Download]
Implementation
: Time-Series Forecasting, Stacked Long Short-Term Memory (LSTM), Scaling and Transforms
Results
:
- Visualized forecasting for the stipulated period of 5 years.
- Extended values from the 100 most recent days to predict the next 30 days.
- At the time the project was run, the model projected a mild plateau in future valuation.
Deployment
: View this project on GitHub.
import pandas_datareader as pdr
key=""
df = pdr.get_data_tiingo('AAPL', api_key=key)
df.to_csv('AAPL.csv')
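A small, optional hardening of the cell above: read the key from an environment variable instead of hard-coding it in the notebook. The variable name `TIINGO_API_KEY` is my choice here, not something the Tiingo client requires.

```python
import os

def load_api_key(var_name="TIINGO_API_KEY"):
    """Return an API key from the environment, or '' if unset.

    The variable name is an assumption; use whatever matches your
    shell export (e.g. `export TIINGO_API_KEY=...`).
    """
    return os.environ.get(var_name, "")

key = load_api_key()  # then: pdr.get_data_tiingo('AAPL', api_key=key)
```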
import pandas as pd
df=pd.read_csv('AAPL.csv')
df.head()
| | Unnamed: 0 | symbol | date | close | high | low | open | volume | adjClose | adjHigh | adjLow | adjOpen | adjVolume | divCash | splitFactor |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0 | AAPL | 2015-05-27 00:00:00+00:00 | 132.045 | 132.260 | 130.05 | 130.34 | 45833246 | 121.682558 | 121.880685 | 119.844118 | 120.111360 | 45833246 | 0.0 | 1.0 |
| 1 | 1 | AAPL | 2015-05-28 00:00:00+00:00 | 131.780 | 131.950 | 131.10 | 131.86 | 30733309 | 121.438354 | 121.595013 | 120.811718 | 121.512076 | 30733309 | 0.0 | 1.0 |
| 2 | 2 | AAPL | 2015-05-29 00:00:00+00:00 | 130.280 | 131.450 | 129.90 | 131.23 | 50884452 | 120.056069 | 121.134251 | 119.705890 | 120.931516 | 50884452 | 0.0 | 1.0 |
| 3 | 3 | AAPL | 2015-06-01 00:00:00+00:00 | 130.535 | 131.390 | 130.05 | 131.20 | 32112797 | 120.291057 | 121.078960 | 119.844118 | 120.903870 | 32112797 | 0.0 | 1.0 |
| 4 | 4 | AAPL | 2015-06-02 00:00:00+00:00 | 129.960 | 130.655 | 129.32 | 129.86 | 33667627 | 119.761181 | 120.401640 | 119.171406 | 119.669029 | 33667627 | 0.0 | 1.0 |
df.tail()
| | Unnamed: 0 | symbol | date | close | high | low | open | volume | adjClose | adjHigh | adjLow | adjOpen | adjVolume | divCash | splitFactor |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1253 | 1253 | AAPL | 2020-05-18 00:00:00+00:00 | 314.96 | 316.50 | 310.3241 | 313.17 | 33843125 | 314.96 | 316.50 | 310.3241 | 313.17 | 33843125 | 0.0 | 1.0 |
| 1254 | 1254 | AAPL | 2020-05-19 00:00:00+00:00 | 313.14 | 318.52 | 313.0100 | 315.03 | 25432385 | 313.14 | 318.52 | 313.0100 | 315.03 | 25432385 | 0.0 | 1.0 |
| 1255 | 1255 | AAPL | 2020-05-20 00:00:00+00:00 | 319.23 | 319.52 | 316.2000 | 316.68 | 27876215 | 319.23 | 319.52 | 316.2000 | 316.68 | 27876215 | 0.0 | 1.0 |
| 1256 | 1256 | AAPL | 2020-05-21 00:00:00+00:00 | 316.85 | 320.89 | 315.8700 | 318.66 | 25672211 | 316.85 | 320.89 | 315.8700 | 318.66 | 25672211 | 0.0 | 1.0 |
| 1257 | 1257 | AAPL | 2020-05-22 00:00:00+00:00 | 318.89 | 319.23 | 315.3500 | 315.77 | 20450754 | 318.89 | 319.23 | 315.3500 | 315.77 | 20450754 | 0.0 | 1.0 |
df1=df.reset_index()['close']
df1
0       132.045
1       131.780
2       130.280
3       130.535
4       129.960
         ...
1253    314.960
1254    313.140
1255    319.230
1256    316.850
1257    318.890
Name: close, Length: 1258, dtype: float64
import matplotlib.pyplot as plt
plt.plot(df1)
(plot: the raw AAPL closing-price series over the five-year window)
import numpy as np
from sklearn.preprocessing import MinMaxScaler
scaler=MinMaxScaler(feature_range=(0,1))
df1=scaler.fit_transform(np.array(df1).reshape(-1,1))
print(df1)
[[0.17607447]
 [0.17495567]
 [0.16862282]
 ...
 [0.96635143]
 [0.9563033 ]
 [0.96491598]]
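As a quick sanity check of what `MinMaxScaler` is doing here: the minimum close maps to 0, the maximum to 1, and `inverse_transform` recovers the original values. A toy run with illustrative prices (not the real series):

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Three made-up prices: min -> 0, max -> 1, midpoint -> 0.5.
prices = np.array([[100.0], [150.0], [200.0]])
sc = MinMaxScaler(feature_range=(0, 1))
scaled = sc.fit_transform(prices)

assert scaled.min() == 0.0 and scaled.max() == 1.0
# inverse_transform undoes the scaling exactly
assert np.allclose(sc.inverse_transform(scaled), prices)
```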
training_size=int(len(df1)*0.65)
test_size=len(df1)-training_size
train_data,test_data=df1[0:training_size,:],df1[training_size:len(df1),:1]
training_size,test_size
(817, 441)
import numpy

# convert an array of values into windowed samples: each sample is
# `time_step` consecutive values, and its target is the next value
def create_dataset(dataset, time_step=1):
    dataX, dataY = [], []
    for i in range(len(dataset) - time_step - 1):
        a = dataset[i:(i + time_step), 0]        # e.g. days 0..99 ...
        dataX.append(a)
        dataY.append(dataset[i + time_step, 0])  # ... predict day 100
    return numpy.array(dataX), numpy.array(dataY)
time_step = 100
X_train, y_train = create_dataset(train_data, time_step)
X_test, ytest = create_dataset(test_data, time_step)
print(X_train.shape, y_train.shape)
(716, 100) (716,)
print(X_test.shape, ytest.shape)
(340, 100) (340,)
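To make the windowing concrete, here is the same helper run on a toy sequence with `time_step=3`. Note that the `-1` in the loop bound means the final value of the series is never used as a target:

```python
import numpy as np

def create_dataset(dataset, time_step=1):
    # Same sliding-window helper as above.
    dataX, dataY = [], []
    for i in range(len(dataset) - time_step - 1):
        dataX.append(dataset[i:(i + time_step), 0])
        dataY.append(dataset[i + time_step, 0])
    return np.array(dataX), np.array(dataY)

toy = np.arange(10, dtype=float).reshape(-1, 1)  # values 0..9
X, y = create_dataset(toy, time_step=3)

# 10 - 3 - 1 = 6 windows; the first window [0, 1, 2] predicts 3,
# and the last value (9) is never used as a target.
assert X.shape == (6, 3) and y.shape == (6,)
assert list(X[0]) == [0.0, 1.0, 2.0] and y[0] == 3.0
```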
# reshape to [samples, time_steps, features], the 3-D input the LSTM expects
X_train = X_train.reshape(X_train.shape[0], X_train.shape[1], 1)
X_test = X_test.reshape(X_test.shape[0], X_test.shape[1], 1)
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.layers import LSTM
model = Sequential()
model.add(LSTM(50, return_sequences=True, input_shape=(100, 1)))  # pass full sequence to the next LSTM
model.add(LSTM(50, return_sequences=True))
model.add(LSTM(50))   # final LSTM emits only its last hidden state
model.add(Dense(1))   # one-step-ahead price prediction
model.compile(loss='mean_squared_error', optimizer='adam')
model.summary()
Model: "sequential_3"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
lstm_7 (LSTM)                (None, 100, 50)           10400
_________________________________________________________________
lstm_8 (LSTM)                (None, 100, 50)           20200
_________________________________________________________________
lstm_9 (LSTM)                (None, 50)                20200
_________________________________________________________________
dense_3 (Dense)              (None, 1)                 51
=================================================================
Total params: 50,851
Trainable params: 50,851
Non-trainable params: 0
_________________________________________________________________
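The parameter counts in the summary can be verified by hand: an LSTM layer has four gates, each with an input kernel, a recurrent kernel, and a bias. A quick check (the helper names below are mine, not Keras API):

```python
def lstm_params(input_dim, units):
    # 4 gates x (input kernel + recurrent kernel + bias)
    return 4 * (input_dim + units + 1) * units

def dense_params(input_dim, units):
    # weights + bias
    return (input_dim + 1) * units

total = (lstm_params(1, 50)       # lstm_7:  10,400
         + lstm_params(50, 50)    # lstm_8:  20,200
         + lstm_params(50, 50)    # lstm_9:  20,200
         + dense_params(50, 1))   # dense_3:     51
assert total == 50851             # matches "Total params" above
```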
model.fit(X_train,y_train,validation_data=(X_test,ytest),epochs=100,batch_size=64,verbose=1)
Epoch 1/100
12/12 [==============================] - 6s 487ms/step - loss: 0.0206 - val_loss: 0.0505
Epoch 2/100
12/12 [==============================] - 4s 309ms/step - loss: 0.0035 - val_loss: 0.0046
Epoch 3/100
12/12 [==============================] - 4s 300ms/step - loss: 0.0014 - val_loss: 0.0040
... (epochs 4-98 omitted; training loss falls steadily to ~1.4e-04 and validation loss to below 1e-03)
Epoch 99/100
12/12 [==============================] - 3s 288ms/step - loss: 1.4087e-04 - val_loss: 9.8092e-04
Epoch 100/100
12/12 [==============================] - 3s 285ms/step - loss: 1.4775e-04 - val_loss: 9.3230e-04
train_predict = model.predict(X_train)
test_predict = model.predict(X_test)
# undo the 0-1 scaling so the predictions are back in dollars
train_predict = scaler.inverse_transform(train_predict)
test_predict = scaler.inverse_transform(test_predict)
import math
from sklearn.metrics import mean_squared_error
# Caution: y_train and ytest are still in the scaled 0-1 range, while the
# predictions were inverse-transformed above, so the RMSEs below mix scales
# and come out inflated. For a meaningful dollar-scale error, inverse-transform
# the targets as well before comparing.
math.sqrt(mean_squared_error(y_train, train_predict))
140.9909210035748
math.sqrt(mean_squared_error(ytest,test_predict))
235.7193088627771
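To see why mixing scales matters, here is a toy illustration with synthetic prices (not the AAPL data): comparing scaled targets against dollar-scale predictions inflates the RMSE, while comparing both on the same scale gives the real error.

```python
import numpy as np
from sklearn.metrics import mean_squared_error
from sklearn.preprocessing import MinMaxScaler

# Synthetic "true" prices and slightly noisy "predictions".
rng = np.random.default_rng(0)
true_prices = np.linspace(100.0, 300.0, 50).reshape(-1, 1)
pred_prices = true_prices + rng.normal(0, 2, size=true_prices.shape)

sc = MinMaxScaler().fit(true_prices)
y_scaled = sc.transform(true_prices)

# Mixing scales (0-1 targets vs. dollar predictions) inflates RMSE...
mixed_rmse = np.sqrt(mean_squared_error(y_scaled, pred_prices))
# ...while a scale-consistent comparison reflects the actual noise level (~2).
dollar_rmse = np.sqrt(mean_squared_error(true_prices, pred_prices))
assert dollar_rmse < mixed_rmse
```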
look_back = 100
# shift train predictions right by `look_back` so they line up with the series
trainPredictPlot = np.empty_like(df1)
trainPredictPlot[:, :] = np.nan
trainPredictPlot[look_back:len(train_predict) + look_back, :] = train_predict
# place test predictions after the training segment
testPredictPlot = np.empty_like(df1)
testPredictPlot[:, :] = np.nan
testPredictPlot[len(train_predict) + (look_back * 2) + 1:len(df1) - 1, :] = test_predict
plt.plot(scaler.inverse_transform(df1))
plt.plot(trainPredictPlot)
plt.plot(testPredictPlot)
plt.show()
len(test_data)
441
x_input = test_data[341:].reshape(1, -1)  # the 100 most recent days of the test window
x_input.shape
(1, 100)
temp_input=list(x_input)
temp_input=temp_input[0].tolist()
from numpy import array
lst_output = []
n_steps = 100
i = 0
while i < 30:
    if len(temp_input) > 100:
        # drop the oldest value so the window stays at 100 days
        x_input = np.array(temp_input[1:])
        print("{} day input {}".format(i, x_input))
        x_input = x_input.reshape(1, -1)
        x_input = x_input.reshape((1, n_steps, 1))
        yhat = model.predict(x_input, verbose=0)
        print("{} day output {}".format(i, yhat))
        temp_input.extend(yhat[0].tolist())
        temp_input = temp_input[1:]
        lst_output.extend(yhat.tolist())
        i = i + 1
    else:
        # first pass: the window is exactly 100 days long
        x_input = x_input.reshape((1, n_steps, 1))
        yhat = model.predict(x_input, verbose=0)
        print(yhat[0])
        temp_input.extend(yhat[0].tolist())
        print(len(temp_input))
        lst_output.extend(yhat.tolist())
        i = i + 1

print(lst_output)
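The same recursive scheme can be written as a compact, self-contained sketch. The stub predictor below (mean of the window) stands in for the trained Keras model so the shapes can be checked without TensorFlow; with the real model, pass `model.predict` instead.

```python
import numpy as np

def recursive_forecast(predict_fn, history, n_steps=100, horizon=30):
    """Roll a one-step model forward `horizon` days.

    predict_fn takes an array of shape (1, n_steps, 1) and returns
    shape (1, 1) -- the same contract as model.predict above.
    """
    window = list(history[-n_steps:])   # last n_steps scaled closes
    out = []
    for _ in range(horizon):
        x = np.array(window[-n_steps:]).reshape(1, n_steps, 1)
        yhat = float(predict_fn(x)[0, 0])
        out.append(yhat)
        window.append(yhat)             # feed the prediction back in
    return out

# Stub predictor so the sketch runs stand-alone; use model.predict in practice.
preds = recursive_forecast(lambda x: x.mean(axis=1), np.linspace(0.0, 1.0, 150))
assert len(preds) == 30
```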
[0.94413203]
101
1 day output [[0.9379593]]
2 day output [[0.9286534]]
3 day output [[0.91987926]]
4 day output [[0.9128097]]
5 day output [[0.90777564]]
6 day output [[0.9047326]]
7 day output [[0.9033923]]
8 day output [[0.90332204]]
9 day output [[0.9040391]]
10 day output [[0.9050924]]
11 day output [[0.906118]]
12 day output [[0.90686554]]
13 day output [[0.90720606]]
14 day output [[0.9071163]]
15 day output [[0.9066538]]
... (the full log also echoes the rolling 100-value input window before each prediction; truncated here)
0.99016297 0.99050072 0.96538039 0.98488559 0.97086887 0.94026007 0.87748037 0.83483915 0.85413324 0.77336823 0.77269273 0.88014017 0.84007431 0.89673225 0.85527316 0.83884995 0.74233725 0.82327113 0.78143207 0.6665963 0.7921557 0.64118044 0.68614371 0.66001013 0.65203074 0.58642236 0.56586169 0.66089673 0.65515494 0.70970193 0.66452757 0.69437642 0.69218104 0.63569197 0.65266402 0.63780292 0.7267162 0.71388162 0.74191506 0.75002111 0.77222832 0.83049059 0.8194292 0.8289707 0.8125475 0.78776492 0.75162543 0.78426074 0.77974331 0.81326522 0.8141096 0.79473106 0.83336148 0.85898843 0.83901883 0.85628641 0.87486279 0.88782403 0.90095415 0.92793211 0.948535 0.93333615 0.91746179 0.92544119 0.91771511 0.9483239 0.94064004 0.96635143 0.9563033 0.96491598 0.94413203 0.93795931 0.92865342 0.91987926 0.91280973 0.90777564 0.90473258 0.90339231 0.90332204 0.90403908 0.90509242 0.90611798 0.90686554 0.90720606 0.90711629] 15 day output [[0.9066538]] 16 day input [0.96246728 0.92295027 0.9598497 0.98792536 0.98594106 0.92531453 0.92172591 0.96474711 0.97572406 0.99159841 0.96972895 0.97614625 0.96795575 1. 
0.99016297 0.99050072 0.96538039 0.98488559 0.97086887 0.94026007 0.87748037 0.83483915 0.85413324 0.77336823 0.77269273 0.88014017 0.84007431 0.89673225 0.85527316 0.83884995 0.74233725 0.82327113 0.78143207 0.6665963 0.7921557 0.64118044 0.68614371 0.66001013 0.65203074 0.58642236 0.56586169 0.66089673 0.65515494 0.70970193 0.66452757 0.69437642 0.69218104 0.63569197 0.65266402 0.63780292 0.7267162 0.71388162 0.74191506 0.75002111 0.77222832 0.83049059 0.8194292 0.8289707 0.8125475 0.78776492 0.75162543 0.78426074 0.77974331 0.81326522 0.8141096 0.79473106 0.83336148 0.85898843 0.83901883 0.85628641 0.87486279 0.88782403 0.90095415 0.92793211 0.948535 0.93333615 0.91746179 0.92544119 0.91771511 0.9483239 0.94064004 0.96635143 0.9563033 0.96491598 0.94413203 0.93795931 0.92865342 0.91987926 0.91280973 0.90777564 0.90473258 0.90339231 0.90332204 0.90403908 0.90509242 0.90611798 0.90686554 0.90720606 0.90711629 0.90665382] 16 day output [[0.90592706]] 17 day input [0.92295027 0.9598497 0.98792536 0.98594106 0.92531453 0.92172591 0.96474711 0.97572406 0.99159841 0.96972895 0.97614625 0.96795575 1. 
0.99016297 0.99050072 0.96538039 0.98488559 0.97086887 0.94026007 0.87748037 0.83483915 0.85413324 0.77336823 0.77269273 0.88014017 0.84007431 0.89673225 0.85527316 0.83884995 0.74233725 0.82327113 0.78143207 0.6665963 0.7921557 0.64118044 0.68614371 0.66001013 0.65203074 0.58642236 0.56586169 0.66089673 0.65515494 0.70970193 0.66452757 0.69437642 0.69218104 0.63569197 0.65266402 0.63780292 0.7267162 0.71388162 0.74191506 0.75002111 0.77222832 0.83049059 0.8194292 0.8289707 0.8125475 0.78776492 0.75162543 0.78426074 0.77974331 0.81326522 0.8141096 0.79473106 0.83336148 0.85898843 0.83901883 0.85628641 0.87486279 0.88782403 0.90095415 0.92793211 0.948535 0.93333615 0.91746179 0.92544119 0.91771511 0.9483239 0.94064004 0.96635143 0.9563033 0.96491598 0.94413203 0.93795931 0.92865342 0.91987926 0.91280973 0.90777564 0.90473258 0.90339231 0.90332204 0.90403908 0.90509242 0.90611798 0.90686554 0.90720606 0.90711629 0.90665382 0.90592706] 17 day output [[0.9050646]] 18 day input [0.9598497 0.98792536 0.98594106 0.92531453 0.92172591 0.96474711 0.97572406 0.99159841 0.96972895 0.97614625 0.96795575 1. 
0.99016297 0.99050072 0.96538039 0.98488559 0.97086887 0.94026007 0.87748037 0.83483915 0.85413324 0.77336823 0.77269273 0.88014017 0.84007431 0.89673225 0.85527316 0.83884995 0.74233725 0.82327113 0.78143207 0.6665963 0.7921557 0.64118044 0.68614371 0.66001013 0.65203074 0.58642236 0.56586169 0.66089673 0.65515494 0.70970193 0.66452757 0.69437642 0.69218104 0.63569197 0.65266402 0.63780292 0.7267162 0.71388162 0.74191506 0.75002111 0.77222832 0.83049059 0.8194292 0.8289707 0.8125475 0.78776492 0.75162543 0.78426074 0.77974331 0.81326522 0.8141096 0.79473106 0.83336148 0.85898843 0.83901883 0.85628641 0.87486279 0.88782403 0.90095415 0.92793211 0.948535 0.93333615 0.91746179 0.92544119 0.91771511 0.9483239 0.94064004 0.96635143 0.9563033 0.96491598 0.94413203 0.93795931 0.92865342 0.91987926 0.91280973 0.90777564 0.90473258 0.90339231 0.90332204 0.90403908 0.90509242 0.90611798 0.90686554 0.90720606 0.90711629 0.90665382 0.90592706 0.90506458] 18 day output [[0.90419257]] 19 day input [0.98792536 0.98594106 0.92531453 0.92172591 0.96474711 0.97572406 0.99159841 0.96972895 0.97614625 0.96795575 1. 
0.99016297 0.99050072 0.96538039 0.98488559 0.97086887 0.94026007 0.87748037 0.83483915 0.85413324 0.77336823 0.77269273 0.88014017 0.84007431 0.89673225 0.85527316 0.83884995 0.74233725 0.82327113 0.78143207 0.6665963 0.7921557 0.64118044 0.68614371 0.66001013 0.65203074 0.58642236 0.56586169 0.66089673 0.65515494 0.70970193 0.66452757 0.69437642 0.69218104 0.63569197 0.65266402 0.63780292 0.7267162 0.71388162 0.74191506 0.75002111 0.77222832 0.83049059 0.8194292 0.8289707 0.8125475 0.78776492 0.75162543 0.78426074 0.77974331 0.81326522 0.8141096 0.79473106 0.83336148 0.85898843 0.83901883 0.85628641 0.87486279 0.88782403 0.90095415 0.92793211 0.948535 0.93333615 0.91746179 0.92544119 0.91771511 0.9483239 0.94064004 0.96635143 0.9563033 0.96491598 0.94413203 0.93795931 0.92865342 0.91987926 0.91280973 0.90777564 0.90473258 0.90339231 0.90332204 0.90403908 0.90509242 0.90611798 0.90686554 0.90720606 0.90711629 0.90665382 0.90592706 0.90506458 0.90419257] 19 day output [[0.9034131]] 20 day input [0.98594106 0.92531453 0.92172591 0.96474711 0.97572406 0.99159841 0.96972895 0.97614625 0.96795575 1. 
0.99016297 0.99050072 0.96538039 0.98488559 0.97086887 0.94026007 0.87748037 0.83483915 0.85413324 0.77336823 0.77269273 0.88014017 0.84007431 0.89673225 0.85527316 0.83884995 0.74233725 0.82327113 0.78143207 0.6665963 0.7921557 0.64118044 0.68614371 0.66001013 0.65203074 0.58642236 0.56586169 0.66089673 0.65515494 0.70970193 0.66452757 0.69437642 0.69218104 0.63569197 0.65266402 0.63780292 0.7267162 0.71388162 0.74191506 0.75002111 0.77222832 0.83049059 0.8194292 0.8289707 0.8125475 0.78776492 0.75162543 0.78426074 0.77974331 0.81326522 0.8141096 0.79473106 0.83336148 0.85898843 0.83901883 0.85628641 0.87486279 0.88782403 0.90095415 0.92793211 0.948535 0.93333615 0.91746179 0.92544119 0.91771511 0.9483239 0.94064004 0.96635143 0.9563033 0.96491598 0.94413203 0.93795931 0.92865342 0.91987926 0.91280973 0.90777564 0.90473258 0.90339231 0.90332204 0.90403908 0.90509242 0.90611798 0.90686554 0.90720606 0.90711629 0.90665382 0.90592706 0.90506458 0.90419257 0.90341312] 20 day output [[0.90279734]] 21 day input [0.92531453 0.92172591 0.96474711 0.97572406 0.99159841 0.96972895 0.97614625 0.96795575 1. 
0.99016297 0.99050072 0.96538039 0.98488559 0.97086887 0.94026007 0.87748037 0.83483915 0.85413324 0.77336823 0.77269273 0.88014017 0.84007431 0.89673225 0.85527316 0.83884995 0.74233725 0.82327113 0.78143207 0.6665963 0.7921557 0.64118044 0.68614371 0.66001013 0.65203074 0.58642236 0.56586169 0.66089673 0.65515494 0.70970193 0.66452757 0.69437642 0.69218104 0.63569197 0.65266402 0.63780292 0.7267162 0.71388162 0.74191506 0.75002111 0.77222832 0.83049059 0.8194292 0.8289707 0.8125475 0.78776492 0.75162543 0.78426074 0.77974331 0.81326522 0.8141096 0.79473106 0.83336148 0.85898843 0.83901883 0.85628641 0.87486279 0.88782403 0.90095415 0.92793211 0.948535 0.93333615 0.91746179 0.92544119 0.91771511 0.9483239 0.94064004 0.96635143 0.9563033 0.96491598 0.94413203 0.93795931 0.92865342 0.91987926 0.91280973 0.90777564 0.90473258 0.90339231 0.90332204 0.90403908 0.90509242 0.90611798 0.90686554 0.90720606 0.90711629 0.90665382 0.90592706 0.90506458 0.90419257 0.90341312 0.90279734] 21 day output [[0.9023812]] 22 day input [0.92172591 0.96474711 0.97572406 0.99159841 0.96972895 0.97614625 0.96795575 1. 
0.99016297 0.99050072 0.96538039 0.98488559 0.97086887 0.94026007 0.87748037 0.83483915 0.85413324 0.77336823 0.77269273 0.88014017 0.84007431 0.89673225 0.85527316 0.83884995 0.74233725 0.82327113 0.78143207 0.6665963 0.7921557 0.64118044 0.68614371 0.66001013 0.65203074 0.58642236 0.56586169 0.66089673 0.65515494 0.70970193 0.66452757 0.69437642 0.69218104 0.63569197 0.65266402 0.63780292 0.7267162 0.71388162 0.74191506 0.75002111 0.77222832 0.83049059 0.8194292 0.8289707 0.8125475 0.78776492 0.75162543 0.78426074 0.77974331 0.81326522 0.8141096 0.79473106 0.83336148 0.85898843 0.83901883 0.85628641 0.87486279 0.88782403 0.90095415 0.92793211 0.948535 0.93333615 0.91746179 0.92544119 0.91771511 0.9483239 0.94064004 0.96635143 0.9563033 0.96491598 0.94413203 0.93795931 0.92865342 0.91987926 0.91280973 0.90777564 0.90473258 0.90339231 0.90332204 0.90403908 0.90509242 0.90611798 0.90686554 0.90720606 0.90711629 0.90665382 0.90592706 0.90506458 0.90419257 0.90341312 0.90279734 0.90238118] 22 day output [[0.9021694]] 23 day input [0.96474711 0.97572406 0.99159841 0.96972895 0.97614625 0.96795575 1. 
0.99016297 0.99050072 0.96538039 0.98488559 0.97086887 0.94026007 0.87748037 0.83483915 0.85413324 0.77336823 0.77269273 0.88014017 0.84007431 0.89673225 0.85527316 0.83884995 0.74233725 0.82327113 0.78143207 0.6665963 0.7921557 0.64118044 0.68614371 0.66001013 0.65203074 0.58642236 0.56586169 0.66089673 0.65515494 0.70970193 0.66452757 0.69437642 0.69218104 0.63569197 0.65266402 0.63780292 0.7267162 0.71388162 0.74191506 0.75002111 0.77222832 0.83049059 0.8194292 0.8289707 0.8125475 0.78776492 0.75162543 0.78426074 0.77974331 0.81326522 0.8141096 0.79473106 0.83336148 0.85898843 0.83901883 0.85628641 0.87486279 0.88782403 0.90095415 0.92793211 0.948535 0.93333615 0.91746179 0.92544119 0.91771511 0.9483239 0.94064004 0.96635143 0.9563033 0.96491598 0.94413203 0.93795931 0.92865342 0.91987926 0.91280973 0.90777564 0.90473258 0.90339231 0.90332204 0.90403908 0.90509242 0.90611798 0.90686554 0.90720606 0.90711629 0.90665382 0.90592706 0.90506458 0.90419257 0.90341312 0.90279734 0.90238118 0.90216941] 23 day output [[0.90213937]] 24 day input [0.97572406 0.99159841 0.96972895 0.97614625 0.96795575 1. 
0.99016297 0.99050072 0.96538039 0.98488559 0.97086887 0.94026007 0.87748037 0.83483915 0.85413324 0.77336823 0.77269273 0.88014017 0.84007431 0.89673225 0.85527316 0.83884995 0.74233725 0.82327113 0.78143207 0.6665963 0.7921557 0.64118044 0.68614371 0.66001013 0.65203074 0.58642236 0.56586169 0.66089673 0.65515494 0.70970193 0.66452757 0.69437642 0.69218104 0.63569197 0.65266402 0.63780292 0.7267162 0.71388162 0.74191506 0.75002111 0.77222832 0.83049059 0.8194292 0.8289707 0.8125475 0.78776492 0.75162543 0.78426074 0.77974331 0.81326522 0.8141096 0.79473106 0.83336148 0.85898843 0.83901883 0.85628641 0.87486279 0.88782403 0.90095415 0.92793211 0.948535 0.93333615 0.91746179 0.92544119 0.91771511 0.9483239 0.94064004 0.96635143 0.9563033 0.96491598 0.94413203 0.93795931 0.92865342 0.91987926 0.91280973 0.90777564 0.90473258 0.90339231 0.90332204 0.90403908 0.90509242 0.90611798 0.90686554 0.90720606 0.90711629 0.90665382 0.90592706 0.90506458 0.90419257 0.90341312 0.90279734 0.90238118 0.90216941 0.90213937] 24 day output [[0.9022528]] 25 day input [0.99159841 0.96972895 0.97614625 0.96795575 1. 
0.99016297 0.99050072 0.96538039 0.98488559 0.97086887 0.94026007 0.87748037 0.83483915 0.85413324 0.77336823 0.77269273 0.88014017 0.84007431 0.89673225 0.85527316 0.83884995 0.74233725 0.82327113 0.78143207 0.6665963 0.7921557 0.64118044 0.68614371 0.66001013 0.65203074 0.58642236 0.56586169 0.66089673 0.65515494 0.70970193 0.66452757 0.69437642 0.69218104 0.63569197 0.65266402 0.63780292 0.7267162 0.71388162 0.74191506 0.75002111 0.77222832 0.83049059 0.8194292 0.8289707 0.8125475 0.78776492 0.75162543 0.78426074 0.77974331 0.81326522 0.8141096 0.79473106 0.83336148 0.85898843 0.83901883 0.85628641 0.87486279 0.88782403 0.90095415 0.92793211 0.948535 0.93333615 0.91746179 0.92544119 0.91771511 0.9483239 0.94064004 0.96635143 0.9563033 0.96491598 0.94413203 0.93795931 0.92865342 0.91987926 0.91280973 0.90777564 0.90473258 0.90339231 0.90332204 0.90403908 0.90509242 0.90611798 0.90686554 0.90720606 0.90711629 0.90665382 0.90592706 0.90506458 0.90419257 0.90341312 0.90279734 0.90238118 0.90216941 0.90213937 0.90225279] 25 day output [[0.90246403]] 26 day input [0.96972895 0.97614625 0.96795575 1. 
0.99016297 0.99050072 0.96538039 0.98488559 0.97086887 0.94026007 0.87748037 0.83483915 0.85413324 0.77336823 0.77269273 0.88014017 0.84007431 0.89673225 0.85527316 0.83884995 0.74233725 0.82327113 0.78143207 0.6665963 0.7921557 0.64118044 0.68614371 0.66001013 0.65203074 0.58642236 0.56586169 0.66089673 0.65515494 0.70970193 0.66452757 0.69437642 0.69218104 0.63569197 0.65266402 0.63780292 0.7267162 0.71388162 0.74191506 0.75002111 0.77222832 0.83049059 0.8194292 0.8289707 0.8125475 0.78776492 0.75162543 0.78426074 0.77974331 0.81326522 0.8141096 0.79473106 0.83336148 0.85898843 0.83901883 0.85628641 0.87486279 0.88782403 0.90095415 0.92793211 0.948535 0.93333615 0.91746179 0.92544119 0.91771511 0.9483239 0.94064004 0.96635143 0.9563033 0.96491598 0.94413203 0.93795931 0.92865342 0.91987926 0.91280973 0.90777564 0.90473258 0.90339231 0.90332204 0.90403908 0.90509242 0.90611798 0.90686554 0.90720606 0.90711629 0.90665382 0.90592706 0.90506458 0.90419257 0.90341312 0.90279734 0.90238118 0.90216941 0.90213937 0.90225279 0.90246403] 26 day output [[0.90272856]] 27 day input [0.97614625 0.96795575 1. 
0.99016297 0.99050072 0.96538039 0.98488559 0.97086887 0.94026007 0.87748037 0.83483915 0.85413324 0.77336823 0.77269273 0.88014017 0.84007431 0.89673225 0.85527316 0.83884995 0.74233725 0.82327113 0.78143207 0.6665963 0.7921557 0.64118044 0.68614371 0.66001013 0.65203074 0.58642236 0.56586169 0.66089673 0.65515494 0.70970193 0.66452757 0.69437642 0.69218104 0.63569197 0.65266402 0.63780292 0.7267162 0.71388162 0.74191506 0.75002111 0.77222832 0.83049059 0.8194292 0.8289707 0.8125475 0.78776492 0.75162543 0.78426074 0.77974331 0.81326522 0.8141096 0.79473106 0.83336148 0.85898843 0.83901883 0.85628641 0.87486279 0.88782403 0.90095415 0.92793211 0.948535 0.93333615 0.91746179 0.92544119 0.91771511 0.9483239 0.94064004 0.96635143 0.9563033 0.96491598 0.94413203 0.93795931 0.92865342 0.91987926 0.91280973 0.90777564 0.90473258 0.90339231 0.90332204 0.90403908 0.90509242 0.90611798 0.90686554 0.90720606 0.90711629 0.90665382 0.90592706 0.90506458 0.90419257 0.90341312 0.90279734 0.90238118 0.90216941 0.90213937 0.90225279 0.90246403 0.90272856] 27 day output [[0.90300757]] 28 day input [0.96795575 1. 
0.99016297 0.99050072 0.96538039 0.98488559 0.97086887 0.94026007 0.87748037 0.83483915 0.85413324 0.77336823 0.77269273 0.88014017 0.84007431 0.89673225 0.85527316 0.83884995 0.74233725 0.82327113 0.78143207 0.6665963 0.7921557 0.64118044 0.68614371 0.66001013 0.65203074 0.58642236 0.56586169 0.66089673 0.65515494 0.70970193 0.66452757 0.69437642 0.69218104 0.63569197 0.65266402 0.63780292 0.7267162 0.71388162 0.74191506 0.75002111 0.77222832 0.83049059 0.8194292 0.8289707 0.8125475 0.78776492 0.75162543 0.78426074 0.77974331 0.81326522 0.8141096 0.79473106 0.83336148 0.85898843 0.83901883 0.85628641 0.87486279 0.88782403 0.90095415 0.92793211 0.948535 0.93333615 0.91746179 0.92544119 0.91771511 0.9483239 0.94064004 0.96635143 0.9563033 0.96491598 0.94413203 0.93795931 0.92865342 0.91987926 0.91280973 0.90777564 0.90473258 0.90339231 0.90332204 0.90403908 0.90509242 0.90611798 0.90686554 0.90720606 0.90711629 0.90665382 0.90592706 0.90506458 0.90419257 0.90341312 0.90279734 0.90238118 0.90216941 0.90213937 0.90225279 0.90246403 0.90272856 0.90300757] 28 day output [[0.903272]] 29 day input [1. 
0.99016297 0.99050072 0.96538039 0.98488559 0.97086887 0.94026007 0.87748037 0.83483915 0.85413324 0.77336823 0.77269273 0.88014017 0.84007431 0.89673225 0.85527316 0.83884995 0.74233725 0.82327113 0.78143207 0.6665963 0.7921557 0.64118044 0.68614371 0.66001013 0.65203074 0.58642236 0.56586169 0.66089673 0.65515494 0.70970193 0.66452757 0.69437642 0.69218104 0.63569197 0.65266402 0.63780292 0.7267162 0.71388162 0.74191506 0.75002111 0.77222832 0.83049059 0.8194292 0.8289707 0.8125475 0.78776492 0.75162543 0.78426074 0.77974331 0.81326522 0.8141096 0.79473106 0.83336148 0.85898843 0.83901883 0.85628641 0.87486279 0.88782403 0.90095415 0.92793211 0.948535 0.93333615 0.91746179 0.92544119 0.91771511 0.9483239 0.94064004 0.96635143 0.9563033 0.96491598 0.94413203 0.93795931 0.92865342 0.91987926 0.91280973 0.90777564 0.90473258 0.90339231 0.90332204 0.90403908 0.90509242 0.90611798 0.90686554 0.90720606 0.90711629 0.90665382 0.90592706 0.90506458 0.90419257 0.90341312 0.90279734 0.90238118 0.90216941 0.90213937 0.90225279 0.90246403 0.90272856 0.90300757 0.90327197] 29 day output [[0.90350425]] [[0.9441320300102234], [0.9379593133926392], [0.9286534190177917], [0.9198792576789856], [0.9128097295761108], [0.9077756404876709], [0.9047325849533081], [0.9033923149108887], [0.9033220410346985], [0.9040390849113464], [0.9050924181938171], [0.9061179757118225], [0.9068655371665955], [0.9072060585021973], [0.9071162939071655], [0.9066538214683533], [0.9059270620346069], [0.905064582824707], [0.9041925668716431], [0.9034131169319153], [0.9027973413467407], [0.902381181716919], [0.902169406414032], [0.9021393656730652], [0.9022527933120728], [0.9024640321731567], [0.9027285575866699], [0.9030075669288635], [0.9032719731330872], [0.9035042524337769]]
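The log above comes from a recursive loop: each day's prediction is appended to the 100-day input window, the oldest value is dropped, and the model predicts again. A minimal sketch of that mechanic, with a stand-in predictor in place of the trained LSTM (`rolling_forecast` and `dummy_model` are illustrative names, not from the original notebook):

```python
import numpy as np

def rolling_forecast(history, predict, n_steps=100, n_days=30):
    """Forecast n_days values recursively: each prediction is fed back
    into the input window, exactly the loop whose log is shown above."""
    temp_input = list(history[-n_steps:])
    lst_output = []
    for _ in range(n_days):
        # Reshape the most recent n_steps values to (batch, timesteps, features)
        x_input = np.array(temp_input[-n_steps:]).reshape(1, n_steps, 1)
        yhat = predict(x_input)          # shape (1, 1), like model.predict
        temp_input.append(yhat[0][0])    # slide the window forward
        lst_output.append(yhat[0][0])
    return lst_output

# Stand-in for the trained LSTM: predicts the mean of the current window.
dummy_model = lambda x: np.array([[x.mean()]])

scaled_history = np.linspace(0.5, 1.0, 100)  # pretend scaled closing prices
forecast = rolling_forecast(scaled_history, dummy_model)
print(len(forecast))  # 30
```

In the notebook, `predict` would be the trained stacked LSTM's `model.predict`, and `history` the scaled test data; the feedback of predictions into the window is why the forecast flattens out over time.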
import numpy as np

# x-axis indices: the last 100 observed days and the 30 forecast days
day_new=np.arange(1,101)
day_pred=np.arange(101,131)
import matplotlib.pyplot as plt
len(df1)  # total number of scaled closing prices
1258
# Plot the last 100 actual prices against the 30-day forecast, both back in dollars
plt.plot(day_new,scaler.inverse_transform(df1[1158:]))
plt.plot(day_pred,scaler.inverse_transform(lst_output))
# Append the 30 forecast values to the full scaled series and plot the tail
df3=df1.tolist()
df3.extend(lst_output)
plt.plot(df3[1200:])
# Undo the MinMax scaling so the combined history-plus-forecast series is in dollars
df3=scaler.inverse_transform(df3).tolist()
plt.plot(df3)
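The `inverse_transform` calls above rely on the same `MinMaxScaler` fitted earlier on the closing prices. A self-contained illustration of that round trip, using made-up prices rather than the AAPL data:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

prices = np.array([[130.0], [145.0], [160.0], [152.5]])  # made-up closing prices

scaler = MinMaxScaler(feature_range=(0, 1))
scaled = scaler.fit_transform(prices)        # squeezed into [0, 1]
restored = scaler.inverse_transform(scaled)  # back to dollar values

print(np.allclose(restored, prices))  # True
```

This is why the scaler object must be kept around after training: without it, the model's [0, 1] outputs cannot be mapped back to actual stock prices.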