1. 22 Jul, 2024 3 commits
  2. 15 Jul, 2024 2 commits
  3. 11 Jul, 2024 2 commits
  4. 09 Jul, 2024 1 commit
  5. 08 Jul, 2024 3 commits
  6. 03 Jul, 2024 2 commits
  7. 26 Jun, 2024 3 commits
    • Nabiz authored · 0f3c388e
      Better performance with lookback = 10 and n_forecast = 25: ARIMA comes in around RMSE = 6.3% and LSTM around RMSE = 15%. Peaks are perfectly aligned for LSTM, with a +1 lag peak shift for ARIMA. A sketch of the forecast step follows below.
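A minimal sketch of what this forecast step could look like, assuming a univariate series `z` and statsmodels' ARIMA; the toy data, the (p, d, q) order, and reporting RMSE as a percentage of the series range are assumptions, not the repository's actual code.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
t = np.arange(300)
z = np.sin(2 * np.pi * t / 50) + 0.1 * rng.standard_normal(t.size)  # toy series

n_forecast = 25                              # forecast horizon from this commit
train, test = z[:-n_forecast], z[-n_forecast:]

fit = ARIMA(train, order=(2, 0, 1)).fit()    # the (p, d, q) order is an assumption
pred = fit.forecast(steps=n_forecast)

rmse = np.sqrt(np.mean((pred - test) ** 2))
print(f"ARIMA RMSE ~ {100 * rmse / (z.max() - z.min()):.1f}% of the series range")
```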
    • Nabiz authored · e1c318bb
      Full-scale analysis for 6 variables, including the lag-1 autoregression variable Z(t-1) as a predictor. Transforming the data sets is required so that they have a similar scale and range for RMSE calculation and prediction. ARIMA 12% accuracy, LSTM 2.7%. The ARIMA out-of-phase issue is solved by using longer forecast steps, n_steps = 25, and LSTM is also in a better phase once the look-back parameter is increased to k_lookback = 7. A sketch of the data preparation follows below.
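A sketch of the preparation this message describes, under assumptions: six placeholder predictor columns, a toy predictand Z, the lag-1 term Z(t-1) added via `shift(1)`, MinMaxScaler to put everything on a similar scale and range, and a k_lookback = 7 sliding window for the LSTM input. Column names and the toy data are not the repository's.

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import MinMaxScaler

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame(rng.standard_normal((n, 6)),
                  columns=[f"var{i}" for i in range(1, 7)])  # six variables (names assumed)
df["Z"] = df["var1"].cumsum()            # toy predictand
df["Z_lag1"] = df["Z"].shift(1)          # lag-1 autoregression term Z(t-1)
df = df.dropna().reset_index(drop=True)

scaler = MinMaxScaler()                  # put all series on a similar scale and range
scaled = pd.DataFrame(scaler.fit_transform(df), columns=df.columns)

k_lookback = 7                           # look-back window from this commit
features = scaled.drop(columns="Z").values
target = scaled["Z"].values

# Build (samples, k_lookback, n_features) windows for the LSTM input.
X = np.stack([features[i:i + k_lookback] for i in range(len(features) - k_lookback)])
y = target[k_lookback:]
print(X.shape, y.shape)
```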
    • Nabiz authored · 7c38e6e6
      Full-scale analysis for 6 variables, including the lag-1 autoregression variable Z(t-1) as a predictor. Transforming the data sets is required so that they have a similar scale and range for RMSE calculation and prediction. ARIMA 10% accuracy, LSTM 7% accuracy, dropping to 3% for a longer test sequence. ARIMA is out of phase with a negative time lag: the peak occurs ahead of the observed one. LSTM predicts the shape and phase correctly.
  8. 25 Jun, 2024 1 commit
    • Nabiz authored · 1d9fc3bf
      Full-scale analysis for 6 variables. Transforming the data sets is required so that they have a similar scale and range for RMSE calculation and prediction. ARIMA 10% accuracy, LSTM 7% accuracy, dropping to 3% for a shorter test sequence. ARIMA is out of phase with a negative time lag: the peak occurs ahead of the observed one. LSTM predicts the shape and phase correctly.
  9. 22 Jun, 2024 1 commit
  10. 05 Jun, 2024 2 commits
  11. 31 May, 2024 1 commit
  12. 29 May, 2024 1 commit
    • Nabiz authored · eb065258
      Now with native values, but scaled to avoid low numbers and to keep the series on a similar scale. A shift of 2 had to be added to avoid zero values. ARIMA reaches only 20% RMSE for a 7-step forecast; LSTM reaches 50% RMSE for 15 steps. For the same next 7 steps, LSTM beats ARIMA with an RMSE of 13%. The HYBRID (LSTM + ARIMA) is a simple mean, so it tends to lie between the two and gives only 18%. A weighted mean, where the weights come from the distance between test values and predictions, might give a better hybrid, but a plain mean is not a good approach if one prediction is biased and the other is close to the truth. A weight penalizing the biased forecast might therefore lead to a better approach; see the sketch below.
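A small sketch of the hybrid logic discussed here: the plain hybrid is the element-wise mean of the ARIMA and LSTM forecasts, and the weighted variant down-weights the more biased model by its error. The numbers are illustrative only, and in practice the weights would be estimated on a validation window rather than on the test values themselves.

```python
import numpy as np

y_true = np.array([3.0, 3.5, 4.2, 4.8, 5.1])      # held-out test values (toy numbers)
pred_arima = np.array([3.4, 3.9, 4.7, 5.3, 5.6])  # forecast that is biased high
pred_lstm = np.array([3.1, 3.4, 4.3, 4.7, 5.2])   # forecast close to the truth

hybrid_mean = 0.5 * (pred_arima + pred_lstm)      # simple mean hybrid

# Weighted hybrid: weight each model by the inverse of its RMSE, so the biased
# forecast is penalised instead of being averaged in with equal weight.
rmse = lambda p: np.sqrt(np.mean((p - y_true) ** 2))
w_arima, w_lstm = 1 / rmse(pred_arima), 1 / rmse(pred_lstm)
hybrid_weighted = (w_arima * pred_arima + w_lstm * pred_lstm) / (w_arima + w_lstm)

for name, p in [("ARIMA", pred_arima), ("LSTM", pred_lstm),
                ("mean hybrid", hybrid_mean), ("weighted hybrid", hybrid_weighted)]:
    print(f"{name:16s} RMSE = {rmse(p):.3f}")
```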
  13. 28 May, 2024 1 commit
    • Nabiz authored · 16cb7086
      Reshape all xarray data sets into pandas data frames, normalize them, and use them for ARIMA and LSTM. ARIMA fails on the normalized data sets, with an RMSE of 100% and without catching the shape; normalization is not a good step for ARIMA. LSTM predicts the shape of the time series perfectly, but since the data were already normalized, the rescaling is incorrect and leads to an RMSE of 196%. The next step would be to use unnormalized data for ARIMA and LSTM so that predictions can be rescaled to original values and give a realistic RMSE... A sketch of the intended flow follows below.
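A sketch of the reshape-and-rescale flow the message points to, assuming a 2-D xarray DataArray: flatten it to a pandas DataFrame, fit MinMaxScaler on the training slice only, and keep the scaler so model output can be mapped back to native units before computing RMSE (the missing inverse step behind the 196% figure). Dimension names, sizes, and the split ratio are assumptions.

```python
import numpy as np
import pandas as pd
import xarray as xr
from sklearn.preprocessing import MinMaxScaler

da = xr.DataArray(np.random.rand(365, 4), dims=("time", "var"),
                  coords={"time": pd.date_range("2020-01-01", periods=365)})
df = da.to_pandas()                            # time x variable DataFrame

split = int(len(df) * 0.8)
scaler = MinMaxScaler().fit(df.iloc[:split])   # fit on the training part only
train_s = scaler.transform(df.iloc[:split])
test_s = scaler.transform(df.iloc[split:])

# ... fit ARIMA / LSTM on the scaled arrays, then undo the scaling so the RMSE
# is computed in the original units instead of on normalized values:
pred_scaled = test_s                           # placeholder for a model's output
pred_native = scaler.inverse_transform(pred_scaled)
```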
  14. 27 May, 2024 1 commit
  15. 24 May, 2024 1 commit
  16. 23 May, 2024 1 commit
  17. 22 May, 2024 1 commit
  18. 09 May, 2024 1 commit
  19. 08 May, 2024 4 commits
  20. 24 Apr, 2024 2 commits
  21. 19 Apr, 2024 1 commit
    • Nabiz authored · 8504067d
      Define the autoregression with Z(t) = Y(t) as predictand and Z(t-1) = Y(t-1), A(t), B(t) as predictors on the synthetic data set, and apply the ARIMA and LSTM models (see the sketch below). The RMSE is on the order of 5% for ARIMA and 4% for LSTM, with a HYBRID model that simply averages the statistical and NN ML models. Next will be forecast uncertainty, XGBoost, Prophet, ForestClassifier, a hybrid weighted average, a majority-voting model ensemble, and running on MN5 with 9 billion parameters.
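A sketch of this setup on the ARIMA side, using statsmodels' exogenous-regressor interface: Z(t) as the predictand and Z(t-1), A(t), B(t) as predictors. The synthetic coefficients, series length, and ARIMA order are assumptions, not values from the repository.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)
n = 400
A, B = rng.standard_normal(n), rng.standard_normal(n)
Z = np.zeros(n)
for t in range(1, n):
    Z[t] = 0.8 * Z[t - 1] + 0.3 * A[t] + 0.2 * B[t] + 0.05 * rng.standard_normal()

df = pd.DataFrame({"Z": Z, "Z_lag1": pd.Series(Z).shift(1), "A": A, "B": B}).dropna()
exog_cols = ["Z_lag1", "A", "B"]           # Z(t-1), A(t), B(t) as predictors
train, test = df.iloc[:-25], df.iloc[-25:]

fit = ARIMA(train["Z"], exog=train[exog_cols], order=(1, 0, 0)).fit()
pred = fit.forecast(steps=len(test), exog=test[exog_cols])

rmse = np.sqrt(np.mean((np.asarray(pred) - test["Z"].values) ** 2))
print(f"ARIMA with exogenous predictors: RMSE = {rmse:.3f}")
```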
  22. 18 Apr, 2024 2 commits
    • Nabiz authored · d03ca631
      Define the autoregression Z(t) = Z(t-1) + A(t) + B(t) on the synthetic data set and apply the ARIMA and LSTM models (see the sketch below). The RMSE is on the order of 8%; the HYBRID model, a simple mean of the statistical and NN ML models, gives a better prediction at 3%. Next will be forecast uncertainty, XGBoost, weighted averaging, a majority-voting model ensemble, hyperparameter tuning, and processing on MN5.
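A minimal sketch of the synthetic construction named here, Z(t) = Z(t-1) + A(t) + B(t); the series length and the noise scale of A and B are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 300
A = rng.normal(0.0, 0.5, n)        # exogenous driver A(t)
B = rng.normal(0.0, 0.5, n)        # exogenous driver B(t)

Z = np.zeros(n)
for t in range(1, n):
    Z[t] = Z[t - 1] + A[t] + B[t]  # lag-1 autoregressive target

# Z, A and B can then be split into train/test windows for the ARIMA and LSTM runs.
```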
    • Nabiz authored · 5a98fca8
      Minor updates
  23. 10 Apr, 2024 1 commit
    • Nabiz authored · e9695f20
      Add some noise to the deterministic series. Include an ARIMA search that minimizes RMSE to evaluate the best (P, D, Q); see the sketch below. Got the integrated ARIMA(0,2,0) with an RMSE of 6%, which indicates an I(2) process, i.e. differencing of order 2. A manual MinMaxScaling is applied to rescale the LSTM values, since the RMSE on normalized values is higher than on the original values; the RMSE is 9% for the simple LSTM.
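A sketch of a brute-force (P, D, Q) search of the kind this commit mentions: fit ARIMA for each candidate order on the training part and keep the order with the lowest forecast RMSE. The search ranges and the toy series are assumptions.

```python
import itertools
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(3)
t = np.arange(200)
z = 0.05 * t**2 / 100 + np.sin(2 * np.pi * t / 30) + 0.1 * rng.standard_normal(t.size)

n_test = 20
train, test = z[:-n_test], z[-n_test:]

best_order, best_rmse = None, np.inf
for p, d, q in itertools.product(range(3), range(3), range(3)):
    try:
        fit = ARIMA(train, order=(p, d, q)).fit()
        pred = fit.forecast(steps=n_test)
        rmse = np.sqrt(np.mean((pred - test) ** 2))
    except Exception:
        continue                         # some orders fail to converge; skip them
    if rmse < best_rmse:
        best_order, best_rmse = (p, d, q), rmse

print(f"best order {best_order}, RMSE {best_rmse:.3f}")
```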
  24. 03 Apr, 2024 2 commits