- 22 Jul, 2024 3 commits
- 15 Jul, 2024 2 commits
- 11 Jul, 2024 2 commits
- 09 Jul, 2024 1 commit
Nabiz authored
Tested different norms; k_scaled performs best. Now implementing with 9 variables using classes. The next step is also to use classes for ARIMA and LSTM. LSTM did not work so far...
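As a rough illustration of the class-based refactor mentioned here, a minimal sketch of a shared interface that both models could implement (class and method names such as BaseForecaster, fit and forecast are assumptions, not the repository's actual API):

```python
# Hypothetical sketch of a shared wrapper interface for the ARIMA and LSTM
# models; all names here are illustrative, not taken from the repository.
from abc import ABC, abstractmethod
import numpy as np


class BaseForecaster(ABC):
    """Common interface so ARIMA and LSTM can be swapped inside one pipeline."""

    @abstractmethod
    def fit(self, y, X=None):
        """Fit the model on the training series y (and optional predictors X)."""

    @abstractmethod
    def forecast(self, n_steps):
        """Return an array with the next n_steps predicted values."""

    @staticmethod
    def rmse(y_true, y_pred):
        # Plain root-mean-square error between observed and predicted values.
        return float(np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)))
```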
- 08 Jul, 2024 3 commits
- 03 Jul, 2024 2 commits
- 26 Jun, 2024 3 commits
Nabiz authored
Better performance with lookback = 10 and n_forecast = 25: RMSE around 6.3% for ARIMA and 15% for LSTM. The LSTM peaks are perfectly aligned, while ARIMA shows a +1-lag peak shift.
Nabiz authored
Full-scale analysis for 6 variables, including the lag-1 autoregressive variable Z(t-1) as a predictor. Transforming the data sets is required so that they have a similar scale and range for prediction and RMSE calculation. ARIMA 12% accuracy, LSTM 2.7%. The out-of-phase issue with ARIMA is solved by using longer forecast steps (n_steps = 25), and LSTM is also better in phase when the lookback parameter is increased to k_lookback = 7.
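A minimal sketch of the two preprocessing steps described here, i.e. adding Z(t-1) as a predictor and building lookback windows for the LSTM (function names and the column layout are assumptions for illustration):

```python
# Illustrative sketch (not the repository code): add the lag-1 target Z(t-1)
# as an extra predictor and build lookback windows for the LSTM.
import numpy as np
import pandas as pd


def add_lag1(df: pd.DataFrame, target: str = "Z") -> pd.DataFrame:
    out = df.copy()
    out[f"{target}_lag1"] = out[target].shift(1)   # Z(t-1) as predictor
    return out.dropna()


def make_windows(values: np.ndarray, k_lookback: int = 7):
    """Turn a (time, features) array into (samples, k_lookback, features)
    windows, with the next value of column 0 (the target Z) as the label."""
    X, y = [], []
    for t in range(k_lookback, len(values)):
        X.append(values[t - k_lookback:t, :])
        y.append(values[t, 0])
    return np.array(X), np.array(y)
```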
Nabiz authored
Full-scale analysis for 6 variables, including the lag-1 autoregressive variable Z(t-1) as a predictor. Transforming the data sets is required so that they have a similar scale and range for prediction and RMSE calculation. ARIMA 10% accuracy, LSTM 7%, improving to 3% for a longer test sequence. ARIMA is out of phase with a negative time lag: the predicted peak occurs ahead of the observed one. LSTM correctly predicts both shape and phase.
- 25 Jun, 2024 1 commit
Nabiz authored
Full-scale analysis for 6 variables. Transforming the data sets is required so that they have a similar scale and range for prediction and RMSE calculation. ARIMA 10% accuracy, LSTM 7%, improving to 3% for a shorter test sequence. ARIMA is out of phase with a negative time lag: the predicted peak occurs ahead of the observed one. LSTM correctly predicts both shape and phase.
- 22 Jun, 2024 1 commit
Nabiz authored
- 05 Jun, 2024 2 commits
Nabiz authored
Now with 6 variables, a natural-logarithm transform, and Euler's number added as a shift. A next step might be bringing the variables to the same range of values. Also, the RMSE criterion misses the ARIMA model parameters. LSTM is the best, as it reproduces the shape with RMSE = 4%; ARIMA does not reproduce the curve, with RMSE = 8%.
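One plausible reading of "natural logarithm transform and adding the Euler value as shift" is adding e before taking the log so that all values stay positive; a hedged sketch under that assumption (function names are illustrative):

```python
# Sketch only: log transform with Euler's number as the shift, plus the
# inverse transform needed before computing errors on the original scale.
import numpy as np
import pandas as pd


def log_transform(df: pd.DataFrame) -> pd.DataFrame:
    return np.log(df + np.e)          # strictly positive as long as df > -e


def inverse_log_transform(df: pd.DataFrame) -> pd.DataFrame:
    return np.exp(df) - np.e          # back to the original scale
```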
Nabiz authored
- 31 May, 2024 1 commit
Nabiz authored
Transforming Z into Z + 2 followed by logarithmic scaling proves to improve both the ARIMA and LSTM models, with RMSE < 2%. But this is partly a matter of the transformed values being close together: before calculating the true error, we should rescale back to the original values and compute the correct RMSE there.
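A small sketch of the rescale-before-scoring idea mentioned here, assuming the forward transform was log(Z + 2) (the function name is illustrative):

```python
# Sketch: undo the log(Z + shift) transform before comparing prediction and
# truth, so the RMSE is reported on the original scale of the data.
import numpy as np


def rmse_original_scale(y_true_trans, y_pred_trans, shift=2.0):
    y_true = np.exp(np.asarray(y_true_trans)) - shift
    y_pred = np.exp(np.asarray(y_pred_trans)) - shift
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))
```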
- 29 May, 2024 1 commit
Nabiz authored
Now with native values, but scaled in order to avoid very low numbers and to keep the variables on a similar scale; a shift of 2 had to be added to avoid zero values. ARIMA reaches only 20% RMSE for a 7-step forecast. LSTM reaches 50% RMSE for 15 steps, but for the same 7 steps LSTM beats ARIMA with an RMSE of 13%. The HYBRID (LSTM + ARIMA) takes the mean, which tends to lie between the two and gives only 18%. A weighted mean, where the weights are based on the distances between the test values and the predictions, might give a better hybrid; however, a plain mean is not a good approach if one of the predictions is biased while the other is close to the truth, so a weight that penalizes the biased forecast might lead to a better approach.
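A sketch of the weighted-hybrid idea raised at the end of this message: weight each model by the inverse of its error on a held-out window, so that a biased forecast is penalized (all names and the exact weighting scheme are assumptions):

```python
# Sketch only: inverse-error weighting of the ARIMA and LSTM forecasts.
import numpy as np


def weighted_hybrid(pred_arima, pred_lstm, y_holdout, pa_holdout, pl_holdout):
    # Errors of each model on a held-out window preceding the forecast.
    err_a = np.sqrt(np.mean((np.asarray(y_holdout) - np.asarray(pa_holdout)) ** 2))
    err_l = np.sqrt(np.mean((np.asarray(y_holdout) - np.asarray(pl_holdout)) ** 2))
    w_a, w_l = 1.0 / err_a, 1.0 / err_l          # larger error -> smaller weight
    return (w_a * np.asarray(pred_arima) + w_l * np.asarray(pred_lstm)) / (w_a + w_l)
```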
- 28 May, 2024 1 commit
Nabiz authored
Reshaped all xarray data sets into pandas DataFrames, normalized them, and used them for ARIMA and LSTM. ARIMA fails on the normalized data, with an RMSE of 100%, and does not catch the shape; normalization is not a good step for ARIMA. LSTM predicts the shape of the time series perfectly, but since the data were already normalized, the rescaling is incorrect and leads to an RMSE of 196%. The next step is to use unnormalized data for ARIMA and LSTM so the predictions can be rescaled to the original values and give a realistic RMSE...
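A rough sketch of the xarray-to-pandas plus MinMaxScaler step described here; the file name, dimension names and variable list are assumptions, not the repository's actual data layout:

```python
# Sketch only: flatten the xarray data at one location/level into a pandas
# DataFrame and scale it; names of file, dims and variables are hypothetical.
import xarray as xr
import pandas as pd
from sklearn.preprocessing import MinMaxScaler

ds = xr.open_dataset("pisces_intdiac.nc")                    # hypothetical file name
df = ds.isel(lat=0, lon=0, depth=6).to_dataframe()           # hypothetical dim names
df = df[["Z", "A", "B", "C", "D", "E", "F"]].dropna()        # assumed variable list

scaler = MinMaxScaler()
df_scaled = pd.DataFrame(scaler.fit_transform(df), index=df.index, columns=df.columns)
```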
- 27 May, 2024 1 commit
Nabiz authored
Normalized the time series with MinMaxScaler. Plotting the subplots takes 50 minutes, which is far too long, so it has to run on the server. ARIMA gave 100% RMSE for the non-scaled series. Now running with scaling, but there is an issue with the indexes for ARIMA...
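A common cause of index trouble with statsmodels' ARIMA is a time index without an associated frequency; a self-contained sketch of the usual fix (the daily frequency and the toy data here are assumptions):

```python
# Sketch: give the series a DatetimeIndex with an explicit frequency so that
# statsmodels' ARIMA does not warn about an unsupported / frequency-less index.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Toy series standing in for the scaled Z variable; daily frequency is assumed.
rng = pd.date_range("2000-01-01", periods=200, freq="D")
series = pd.Series(np.sin(np.arange(200) / 10.0), index=rng)

model = ARIMA(series, order=(1, 1, 1)).fit()
forecast = model.forecast(steps=7)
```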
- 24 May, 2024 1 commit
Nabiz authored
- 23 May, 2024 1 commit
Nabiz authored
The 7 variables are now loaded and split into train and test data sets. The time series are subplotted with labels and units for a given location at level = 6. This has to run on MN5.
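A minimal sketch of a chronological train/test split for the loaded variables (no shuffling, since these are time series); the 80/20 ratio is an assumption:

```python
# Sketch: keep the time order intact and cut once, train before test.
import pandas as pd


def train_test_split_ts(df: pd.DataFrame, test_fraction: float = 0.2):
    split = int(len(df) * (1.0 - test_fraction))
    return df.iloc[:split], df.iloc[split:]
```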
- 22 May, 2024 1 commit
Nabiz authored
Starting to apply multivariate time series analysis to the PISCES variables. First step: loading the xarray data sets...
- 09 May, 2024 1 commit
Nabiz authored
Now applying a simple LSTM to the PISCES INTDIAC data set. Next: the multivariate case with the Z, A, B, C, D, E, F PISCES variables from the Excel table discussed with Raffa.
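A minimal sketch of the kind of simple LSTM this refers to, in Keras; layer sizes, epochs and the toy data are assumptions, not the repository's settings:

```python
# Sketch only: a one-layer LSTM for one-step-ahead forecasting.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

k_lookback, n_features = 7, 1
model = Sequential([
    LSTM(50, input_shape=(k_lookback, n_features)),
    Dense(1),                                    # one-step-ahead forecast
])
model.compile(optimizer="adam", loss="mse")

# X: (samples, k_lookback, n_features) windows, y: (samples,) next values;
# random toy data here just to make the sketch runnable.
X = np.random.rand(100, k_lookback, n_features)
y = np.random.rand(100)
model.fit(X, y, epochs=10, batch_size=16, verbose=0)
```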
- 08 May, 2024 4 commits
- 24 Apr, 2024 2 commits
- 19 Apr, 2024 1 commit
Nabiz authored
Defined the autoregression on the synthetic data set with Z(t) = Y(t) as the predictand and Z(t-1) = Y(t-1), A(t), B(t) as predictors, and applied the ARIMA and LSTM models. The RMSE is on the order of 5% for ARIMA and 4% for LSTM. The HYBRID model is a simple average of the statistical and NN ML models. Next: forecast uncertainty, XGBoost, Prophet, a random forest classifier, a hybrid weighted average, a majority-voting model ensemble, and running on MN5 with 9 billion parameters.
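A sketch of ARIMA with exogenous predictors (via statsmodels' SARIMAX) together with the simple-average HYBRID mentioned here; orders and function names are illustrative:

```python
# Sketch only: regress Z(t) on its own past plus the exogenous A(t), B(t),
# then average the ARIMA and LSTM forecasts as the HYBRID model.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX


def fit_arimax(y: pd.Series, exog: pd.DataFrame, order=(1, 0, 0)):
    return SARIMAX(y, exog=exog, order=order).fit(disp=False)


def hybrid_mean(pred_arima: np.ndarray, pred_lstm: np.ndarray) -> np.ndarray:
    # The HYBRID model of this commit: a simple average of the two forecasts.
    return 0.5 * (np.asarray(pred_arima) + np.asarray(pred_lstm))
```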
- 18 Apr, 2024 2 commits
Nabiz authored
Defined the autoregression Z(t) = Z(t-1) + A(t) + B(t) on the synthetic data set and applied the ARIMA and LSTM models. The RMSE is on the order of 8%; the HYBRID model, a simple mean of the statistical and NN ML models, gives a better prediction at 3%. Next: forecast uncertainty, XGBoost, weighted average, majority-voting model ensemble, hyperparameters, and processing on MN5.
Nabiz authored
- 10 Apr, 2024 1 commit
Nabiz authored
Added some noise to the deterministic series. Included the ARIMA search with an RMSE-minimization function to evaluate the best (p, d, q). Got the integrated ARIMA(0,2,0) with an RMSE of 6%, which indicates an I(2) process, i.e. differencing of order 2. Doing a manual min-max rescaling of the LSTM values, since the RMSE of the normalized values is higher than for the original values, with an RMSE of 9% for the simple LSTM.
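A sketch of the RMSE-minimizing (p, d, q) search described here; the search ranges and the error handling are assumptions:

```python
# Sketch: brute-force search over small (p, d, q) orders, keeping the order
# with the lowest out-of-sample RMSE.
import itertools
import numpy as np
from statsmodels.tsa.arima.model import ARIMA


def best_order(train, test, max_p=2, max_d=2, max_q=2):
    best, best_rmse = None, np.inf
    for p, d, q in itertools.product(range(max_p + 1), range(max_d + 1), range(max_q + 1)):
        try:
            fit = ARIMA(train, order=(p, d, q)).fit()
            pred = fit.forecast(steps=len(test))
            rmse = np.sqrt(np.mean((np.asarray(test) - np.asarray(pred)) ** 2))
        except Exception:
            continue                      # skip orders that fail to converge
        if rmse < best_rmse:
            best, best_rmse = (p, d, q), rmse
    return best, best_rmse
```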
- 03 Apr, 2024 2 commits