1. 08 Aug, 2024 1 commit
  2. 02 Aug, 2024 2 commits
  3. 01 Aug, 2024 4 commits
  4. 31 Jul, 2024 5 commits
  5. 29 Jul, 2024 1 commit
  6. 26 Jul, 2024 1 commit
  7. 23 Jul, 2024 1 commit
  8. 22 Jul, 2024 3 commits
  9. 15 Jul, 2024 2 commits
  10. 11 Jul, 2024 2 commits
  11. 09 Jul, 2024 1 commit
  12. 08 Jul, 2024 3 commits
  13. 03 Jul, 2024 2 commits
  14. 26 Jun, 2024 3 commits
    • Nabiz authored · 0f3c388e
      Better performance with lookback = 10 and n_forecast = 25: ARIMA reaches RMSE ≈ 6.3% and LSTM RMSE ≈ 15%. Peaks are perfectly aligned for LSTM; ARIMA shows a +1 lag peak shift.
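The commits throughout this log report forecast error as an RMSE percentage. A minimal sketch of how such a figure could be computed, assuming the percentage is the RMSE normalized by the range of the test series (the commits do not state the exact normalization, so `rmse_percent` is a hypothetical helper):

```python
import numpy as np

def rmse_percent(y_true, y_pred):
    """Range-normalized RMSE in percent.

    Normalizing by (max - min) of the test data is an assumption;
    the commit messages only quote the resulting percentage.
    """
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return 100.0 * rmse / (y_true.max() - y_true.min())
```

With this definition, a constant offset of 1 on a series spanning 4 units yields an RMSE of 25%.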
    • Nabiz authored · e1c318bb
      Full scale analysis for 6 variables, now including the lag-1 autoregression variable Zt-1 as a predictor. Transforming the data sets is required so they have a similar scale and range for RMSE calculation and prediction. ARIMA 12% accuracy, LSTM 2.7%. The ARIMA out-of-phase issue is solved by using longer forecast steps (n_steps = 25), and LSTM is now in a better phase when the lookback parameter is increased to k_lookback = 7.
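Adding the lag-1 autoregression variable Zt-1 as a predictor, as this commit describes, can be sketched with a pandas `shift`. The column names and values here are hypothetical; the repo's actual frame holds the 6 analysis variables:

```python
import pandas as pd

# Hypothetical frame standing in for the 6-variable data set.
df = pd.DataFrame({"Z": [5.0, 6.0, 4.0, 7.0, 8.0]})

# Add the lag-1 autoregression variable Z(t-1) as an extra predictor column.
df["Z_lag1"] = df["Z"].shift(1)

# The first row has no lag-1 value, so it is dropped before fitting.
df = df.dropna()
```

After the shift, each row pairs the current value Z(t) with its predecessor Z(t-1), ready to be used as a regressor.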
    • Nabiz authored · 7c38e6e6
      Full scale analysis for 6 variables, including the lag-1 autoregression variable Zt-1 as a predictor. Transforming the data sets is required so they have a similar scale and range for RMSE calculation and prediction. ARIMA 10% accuracy, LSTM 7%, down to 3% for a longer test sequence. ARIMA is out of phase with a negative time lag: the peak occurs ahead. LSTM predicts the shape and phase correctly.
  15. 25 Jun, 2024 1 commit
    • Nabiz authored · 1d9fc3bf
      Full scale analysis for 6 variables. Transforming the data sets is required so they have a similar scale and range for RMSE calculation and prediction. ARIMA 10% accuracy, LSTM 7%, down to 3% for a shorter test sequence. ARIMA is out of phase with a negative time lag: the peak occurs ahead. LSTM predicts the shape and phase correctly.
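Several commits note that the data sets must be transformed to a similar scale and range before comparing RMSE. A minimal sketch, assuming a min-max transformation to [0, 1] (the repo's exact transformation is not stated in the messages):

```python
import numpy as np

def minmax_scale(x):
    """Scale a series to [0, 1] so all variables share a similar range.

    This is one plausible reading of the 'similar scale and range'
    transformation mentioned in the commits, not necessarily the one used.
    """
    x = np.asarray(x, dtype=float)
    return (x - x.min()) / (x.max() - x.min())
```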
  16. 22 Jun, 2024 1 commit
  17. 05 Jun, 2024 2 commits
  18. 31 May, 2024 1 commit
  19. 29 May, 2024 1 commit
    • Nabiz authored · eb065258
      Now with native values, but scaled to avoid low numbers and keep the values on a similar scale. A shift of 2 had to be added to avoid zero values. ARIMA reaches only 20% RMSE for a 7-step forecast; LSTM reaches 50% RMSE for 15 steps, but for the same 7 steps LSTM beats ARIMA with an RMSE of 13%. The hybrid (LSTM + ARIMA) takes the mean, which tends to lie between the two and gives only 18%. A weighted mean, where the weights are the distances between test values and predictions, might give a better hybrid, but this hybrid is not a good approach if one prediction is biased and the other is close to the truth. A weight penalizing the biased forecast might therefore lead to a better approach.
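The hybrid described above can be sketched as follows: a plain mean of the two forecasts, and, as the commit's proposed refinement, an error-weighted mean that penalizes the more biased model. The inverse-error weighting is an assumption; the commit only suggests that the weights should penalize the biased forecast:

```python
import numpy as np

def hybrid_forecast(pred_lstm, pred_arima, err_lstm=None, err_arima=None):
    """Combine LSTM and ARIMA forecasts.

    With no error estimates, return the plain mean the commit describes.
    Otherwise, weight each model inversely to its error so the more
    biased forecast contributes less (hypothetical weighting scheme).
    """
    pred_lstm = np.asarray(pred_lstm, dtype=float)
    pred_arima = np.asarray(pred_arima, dtype=float)
    if err_lstm is None or err_arima is None:
        return 0.5 * (pred_lstm + pred_arima)
    w_lstm, w_arima = 1.0 / err_lstm, 1.0 / err_arima
    return (w_lstm * pred_lstm + w_arima * pred_arima) / (w_lstm + w_arima)
```

As the commit observes, the plain mean always lands between the two forecasts, so when one model is accurate and the other biased, the hybrid is dragged away from the truth; the weighted variant mitigates this.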
  20. 28 May, 2024 1 commit
    • Nabiz authored · 16cb7086
      Reshape all xarray data sets into pandas data frames, normalize them, and use them for ARIMA and LSTM. ARIMA fails on the normalized data sets with an RMSE of 100% and does not catch the shape: normalization is not a good step for ARIMA. LSTM predicts the shape of the time series perfectly, but since the data were already normalized, the rescaling is incorrect and leads to an RMSE of 196%. Next step: use unnormalized data for both ARIMA and LSTM, so the results can be rescaled to original values and yield a realistic RMSE.
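The 196% RMSE above comes from rescaling predictions with the wrong reference values. A minimal sketch of the fix: remember the original min/max of the unnormalized data, predict in normalized space, and invert the transformation before computing RMSE. The helper names are hypothetical:

```python
import numpy as np

def fit_scaler(x):
    """Record the min/max of the original (unnormalized) data,
    so predictions made in normalized space can be mapped back
    to original units before computing RMSE."""
    x = np.asarray(x, dtype=float)
    return x.min(), x.max()

def inverse_transform(x_scaled, lo, hi):
    """Undo the min-max normalization: back to original units."""
    return np.asarray(x_scaled, dtype=float) * (hi - lo) + lo
```

Rescaling with the parameters of already-normalized data (rather than the original series) is exactly what produced the inflated error in this commit.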
  21. 27 May, 2024 1 commit
  22. 24 May, 2024 1 commit