Univariate models only use the date and target variable values when producing a forecast. Most common statistical forecasting models, like arima and ets, are univariate.
Multivariate models leverage many features when producing a forecast, provided as input data before model training. These features can be created automatically using the feature engineering techniques within the package, or provided as external regressors. Most common machine learning models today, like xgboost and cubist, are multivariate. An important thing to note is that the multivariate models provided in the package can leverage different feature engineering recipes, each containing different techniques for creating features. These recipes are identified by the letter "R" followed by a number, like "R1" or "R2". More info can be found in the feature engineering vignette.
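As a purely hypothetical sketch of what multivariate input data might look like, here is a table with an invented external regressor column named promo supplied as an extra column alongside the combo variable, date, and target (all names and values below are made up for illustration):

library(dplyr)

# Hypothetical input data: "promo" is an invented external regressor
# provided as an extra column next to the date and target.
hist_data <- tibble(
  id    = "store_1",                                                # combo variable
  date  = seq(as.Date("2020-01-01"), by = "month", length.out = 24),
  value = rnorm(24, mean = 100, sd = 10),                           # target variable
  promo = rbinom(24, size = 1, prob = 0.3)                          # external regressor
)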
Global models take the entire data set across all individual time series and model them all at once within a single model. Global models are only run if the input data contains more than one individual time series.
Local models take each individual time series from the input data and model it separately.
Ensemble models are trained on predictions made by individual models. For example, a glmnet ensemble model takes forecasts made by each individual model and feeds them as training data into a glmnet model.
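As a rough sketch of that stacking idea (not the package's internal code), the forecasts from the individual models become the predictor matrix for a glmnet fit; the column names and data below are made up:

library(glmnet)

# Rough stacking sketch: each column holds one individual model's forecasts,
# and glmnet learns how to weight them against the actual target values.
n <- 100
x <- cbind(
  arima_fcst   = rnorm(n),
  xgboost_fcst = rnorm(n),
  cubist_fcst  = rnorm(n)
)
y <- rnorm(n)  # actual target values

ensemble_fit  <- cv.glmnet(x, y)  # cross-validated elastic net ensemble
ensemble_pred <- predict(ensemble_fit, newx = x, s = "lambda.min")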
By default within prep_models(), the multistep_horizon argument is set to FALSE. If set to TRUE, a multistep horizon approach is taken for specific multivariate models trained on the R1 feature engineering recipe. Only a subset of the multivariate models, glmnet among them, can run as multistep.
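As a minimal sketch of how the toggle might be set, assuming the standard finnts workflow of set_run_info(), prep_data(), and prep_models() (all argument values below are illustrative placeholders):

library(finnts)

# Illustrative workflow; data and argument values are placeholders.
run_info <- set_run_info(
  experiment_name = "demo",
  run_name        = "multistep_test"
)

prep_data(
  run_info,
  input_data       = hist_data,  # historical data, as sketched earlier
  combo_variables  = c("id"),
  target_variable  = "value",
  date_type        = "month",
  forecast_horizon = 3
)

prep_models(
  run_info,
  models_to_run     = c("glmnet", "xgboost"),
  multistep_horizon = TRUE       # enable horizon-specific sub-models
)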
A multistep model optimizes for each period in a forecast horizon. Let's take an example of a monthly data set with a forecast horizon of 3. When creating the features for the R1 recipe, finnts will create lags of 1, 2, 3, 6, 9, and 12 months. When training a multistep model, it will iteratively use specific features to train the model. First it trains a model on the first forecast horizon (H1), using all available feature lags. For H2 it uses lags of 2 or more, and for H3 it uses lags of 3 or more. So the final model is actually a collection of multiple models, each trained on a specific horizon. This lets the model use all available data when creating the forecast. In our example, one glmnet model actually has three separate horizon-specific models under the hood.
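The lag rule can be made concrete with a small illustration (plain R, not package internals):

# Which R1 lag features each horizon-specific model can use,
# following the rule above: the model for horizon h uses lags >= h.
lags    <- c(1, 2, 3, 6, 9, 12)  # lags created for monthly data
horizon <- 3

for (h in seq_len(horizon)) {
  usable <- lags[lags >= h]
  cat(sprintf("H%d model trains on lags: %s\n", h,
              paste(usable, collapse = ", ")))
}
#> H1 model trains on lags: 1, 2, 3, 6, 9, 12
#> H2 model trains on lags: 2, 3, 6, 9, 12
#> H3 model trains on lags: 3, 6, 9, 12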
A few more things to mention. If multistep_horizon is TRUE, then other multivariate models like arima-boost or prophet-xregs will not run a multistep horizon approach. Instead they will use lags that are equal to or greater than the forecast horizon. One set of hyperparameters is chosen for each multistep model, meaning glmnet will only use one combination of final hyperparameters and apply it to each horizon model. Multistep models are not run for the R2 recipe, since it has its own way of dealing with multiple horizons. Finally, if feature_selection is turned on, it will be run for each horizon-specific model, meaning that for a 3 month forecast horizon the feature selection process will run 3 times, once for each set of features tied to a specific horizon.
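Assuming feature_selection is toggled through prep_models() in the same way as multistep_horizon (an assumption inferred from this vignette's wording, so check the package documentation), combining the two would look like:

# Hedged sketch: both toggles together. feature_selection as a prep_models()
# argument is an assumption, not confirmed by this vignette.
prep_models(
  run_info,
  multistep_horizon = TRUE,  # horizon-specific sub-models where supported
  feature_selection = TRUE   # selection then runs once per horizon sub-model
)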
Most of the models within the package are built on a fantastic time series library called modeltime, which was built on top of tidymodels. Tidymodels is a fantastic series of packages that help with feature engineering (recipes), hyperparameter tuning (tune), model training (parsnip), and back testing (rsample). Big shout out to the modeltime and tidymodels teams for being the shoulders this package stands on!
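For readers new to that stack, here is a tiny generic example, unrelated to finnts internals, showing each package's role on made-up data:

library(tidymodels)

# Generic tidymodels example with invented data, one line per package role.
df <- data.frame(y = rnorm(100), x1 = rnorm(100), x2 = rnorm(100))

rec <- recipe(y ~ ., data = df) %>%         # recipes: feature engineering
  step_normalize(all_numeric_predictors())

spec <- linear_reg(penalty = tune()) %>%    # parsnip: model specification
  set_engine("glmnet")

folds <- vfold_cv(df, v = 5)                # rsample: resampling / back testing

tuned <- tune_grid(                         # tune: hyperparameter tuning
  workflow() %>% add_recipe(rec) %>% add_model(spec),
  resamples = folds,
  grid      = 5
)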