6 Summary


Over the last decade, the network science literature has gained popularity across a broad spectrum of areas, such as engineering, economics, and geography. There is a wide range of quantitative forecasting methods, often developed within specific disciplines for specific purposes. Each method has its own properties, accuracy, and costs that must be considered when choosing among them.

We can highlight some recommendations for applying each model to a forecasting problem. Since certain aspects of each model could be improved, the strengths and weaknesses of each must be pointed out.

To improve the accuracy of the STLF method, a fine-tuning process is essential. Each model parameter must be adjusted precisely to achieve better accuracy across different values of lambda, and the Box-Cox transformation should be tested for each time series variable individually. Certain aspects of the process are highly computationally expensive: the method could only be run for a very small number of lags due to the large number of calculations nested within the function.
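The per-series Box-Cox tuning described above can be sketched as follows. This is a minimal illustration in Python using SciPy's maximum-likelihood lambda estimator as a stand-in for the tuning procedure; the series itself is synthetic.

```python
import numpy as np
from scipy.stats import boxcox
from scipy.special import inv_boxcox

# Synthetic, strictly positive series (illustrative only).
rng = np.random.default_rng(0)
series = rng.lognormal(mean=0.0, sigma=0.5, size=200) + 1.0

# Estimate lambda by maximum likelihood for this one series;
# in practice each time series variable gets its own lambda.
transformed, lam = boxcox(series)
print(f"estimated lambda: {lam:.3f}")

# The transform is invertible, so forecasts produced on the
# transformed scale can be mapped back to the original scale.
recovered = inv_boxcox(transformed, lam)
assert np.allclose(recovered, series)
```

Repeating this estimation for every series, and then evaluating forecast accuracy across candidate lambdas, is what makes the tuning step expensive.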

Although the applied STLF approach was based on STL+ETS, other methods could also be assessed, for example STL+THETAF or any user-defined function that operates on time series. Finally, the uncertainty of the model could be reduced, and its performance improved, by forecasting fewer steps ahead or by aggregating the data over longer intervals, albeit at the risk of losing granularity.

Support vector regression offers good generalization performance, largely due to its ability to control two different factors: the error rate on the training data and the capacity of the learning machine to predict unseen samples. This generalization ability is indeed a key attraction of SVR, stemming from the structural risk minimization (SRM) principle: a strategy for handling the trade-off between the confidence term associated with the VC dimension and the empirical error frequency.
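The two levers mentioned above can be made concrete with scikit-learn's SVR, used here as an illustrative implementation: `epsilon` sets the tolerated training error (the width of the insensitive tube), while `C` controls how much capacity the model is allowed. The data are synthetic.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(2)
X = np.sort(rng.uniform(0, 5, 80)).reshape(-1, 1)
y = np.sin(X).ravel() + rng.normal(0, 0.1, 80)

# A wide tube and small C give a flatter, more regularized fit;
# a narrow tube and large C track the training data more closely.
loose = SVR(kernel="rbf", C=0.1, epsilon=0.5).fit(X, y)
tight = SVR(kernel="rbf", C=100.0, epsilon=0.01).fit(X, y)

print("support vectors (loose):", loose.support_.size)
print("support vectors (tight):", tight.support_.size)
```

The narrower tube leaves many more points outside it, so the "tight" model retains far more support vectors, reflecting the capacity side of the trade-off.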

Another merit of SVR is that the model tuning process can be streamlined by resampling methods such as k-fold cross-validation. This way, settings that are clearly sub-optimal can be discarded early, leading to a solution that benefits from robustness and from the generalization ability to integrate new information into its existing structure.
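A minimal sketch of this cross-validated tuning, written with scikit-learn as an analog of the caret workflow; the parameter grid is illustrative only. A `TimeSeriesSplit` is used so that training folds always precede validation folds, as time series resampling requires.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, TimeSeriesSplit
from sklearn.svm import SVR

# Synthetic series turned into a supervised problem (illustrative).
rng = np.random.default_rng(3)
t = np.arange(200)
y = np.sin(t / 10) + rng.normal(0, 0.1, 200)
X = t.reshape(-1, 1)

# Candidate settings; clearly sub-optimal ones are discarded by CV.
grid = {"C": [0.1, 1, 10], "epsilon": [0.01, 0.1, 0.5]}
search = GridSearchCV(SVR(kernel="rbf"), grid,
                      cv=TimeSeriesSplit(n_splits=5),
                      scoring="neg_mean_absolute_error")
search.fit(X, y)
print("best settings:", search.best_params_)
```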

On the other hand, some comments can be made about limitations and possible improvements. Training on large datasets represents a serious performance limitation, particularly in terms of speed and parameter control. The process of tuning parameters is time-consuming, even when benefiting from the versatility of the caret package for R.
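One common way to ease the large-dataset limitation, not taken from the original study but worth noting, is to replace the exact RBF kernel with a low-rank approximation plus a linear SVR, which scales much better with sample size. A sketch with scikit-learn's Nystroem transformer; sizes and settings are illustrative.

```python
import numpy as np
from sklearn.kernel_approximation import Nystroem
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVR

# A larger synthetic dataset where exact kernel SVR starts to slow down.
rng = np.random.default_rng(6)
X = rng.uniform(0, 5, (5000, 1))
y = np.sin(X).ravel() + rng.normal(0, 0.1, 5000)

# Nystroem approximates the RBF feature map with 100 components,
# so the downstream regressor only has to solve a linear problem.
model = make_pipeline(Nystroem(gamma=1.0, n_components=100, random_state=0),
                      LinearSVR(C=1.0, max_iter=5000))
model.fit(X, y)
print("in-sample R^2:", round(model.score(X, y), 3))
```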

For artificial neural networks, a good way to improve the outcomes of the method is to change the set of parameters, such as the size, the weight decay, and the number of hidden layers. In this case, however, we can rule out changing the weight decay and the size, assuming that the best values for both were selected during the training phase. Increasing the number of hidden layers should be considered in order to improve the ability to identify more complex patterns. In addition, aggregating the data over longer time intervals is a good strategy for avoiding cumulative prediction error. One caveat of these non-parametric methods is that selecting the appropriate number of nodes, or interpreting the resulting structure, can be counter-intuitive; such models are often referred to as black boxes.
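The tuning knobs named above, the hidden-layer configuration and the weight decay, can be sketched with scikit-learn's `MLPRegressor` standing in for the original nnet-style model; the data and settings are illustrative only.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic nonlinear regression problem (illustrative).
rng = np.random.default_rng(4)
X = rng.uniform(-2, 2, (300, 1))
y = np.sin(3 * X).ravel() + rng.normal(0, 0.1, 300)

# One hidden layer vs two: deeper nets can capture more complex
# patterns, at the cost of harder tuning and less interpretability.
# In scikit-learn, `alpha` plays the role of the weight decay.
shallow = MLPRegressor(hidden_layer_sizes=(8,), alpha=1e-3,
                       max_iter=2000, random_state=0).fit(X, y)
deeper = MLPRegressor(hidden_layer_sizes=(8, 8), alpha=1e-3,
                      max_iter=2000, random_state=0).fit(X, y)

print("shallow R^2:", round(shallow.score(X, y), 3))
print("deeper  R^2:", round(deeper.score(X, y), 3))
```

Note that nothing in the fitted weight matrices explains *why* the deeper net behaves as it does, which is exactly the black-box caveat raised above.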

It is important to test the feasibility and quality of each model. The testing process can yield important new insights into the nature of the forecasting problem, and it provides the opportunity to double-check the robustness and validity of the method. Testing the outcomes against known results also helps ensure structural integrity and validity. Ultimately, we may always conclude that we need to go back and modify the model formulation and implementation.
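The testing step described above can be sketched as a rolling-origin backtest: refit on an expanding window, forecast one step ahead, and score the forecast against the known outcome. A naive "last value" model stands in here for any of the forecasting methods discussed; the series is synthetic.

```python
import numpy as np

rng = np.random.default_rng(5)
series = np.cumsum(rng.normal(0, 1, 120))  # synthetic random walk

errors = []
for origin in range(100, len(series) - 1):
    history = series[: origin + 1]
    forecast = history[-1]           # naive one-step-ahead forecast
    actual = series[origin + 1]      # known outcome used for validation
    errors.append(abs(actual - forecast))

mae = float(np.mean(errors))
print(f"rolling-origin MAE: {mae:.3f}")
```

If the backtest error is unacceptable, this is precisely the point at which one goes back and revises the model formulation and implementation.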