Package website: release | dev
This package provides essential learners for mlr3, maintained by the mlr-org team. Additional learners can be found in the mlr3extralearners package on GitHub; requests for new learners should be made there.
:point_right: Table of all learners
```r
# CRAN version:
install.packages("mlr3learners")

# Development version:
remotes::install_github("mlr-org/mlr3learners")
```
If you also want to install all packages of the connected learners, set `dependencies = TRUE`:
```r
# CRAN version:
install.packages("mlr3learners", dependencies = TRUE)

# Development version:
remotes::install_github("mlr-org/mlr3learners", dependencies = TRUE)
```
Classification learners:

ID | Learner | Package |
---|---|---|
classif.cv_glmnet | Penalized Logistic Regression | glmnet |
classif.glmnet | Penalized Logistic Regression | glmnet |
classif.kknn | k-Nearest Neighbors | kknn |
classif.lda | LDA | MASS |
classif.log_reg | Logistic Regression | stats |
classif.multinom | Multinomial Log-Linear Model | nnet |
classif.naive_bayes | Naive Bayes | e1071 |
classif.nnet | Single Layer Neural Network | nnet |
classif.qda | QDA | MASS |
classif.ranger | Random Forest | ranger |
classif.svm | SVM | e1071 |
classif.xgboost | Gradient Boosting | xgboost |
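As a brief illustration, here is a minimal sketch of how a classification learner from the table above is used with mlr3 (assuming mlr3 is installed and, for this particular learner, the ranger package as well):

```r
library(mlr3)
library(mlr3learners)

# built-in example task and a learner from the table above
task = tsk("sonar")
learner = lrn("classif.ranger")

# train on the task, predict, and score the prediction
learner$train(task)
prediction = learner$predict(task)
prediction$score(msr("classif.acc"))
```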
Regression learners:

ID | Learner | Package |
---|---|---|
regr.cv_glmnet | Penalized Linear Regression | glmnet |
regr.glmnet | Penalized Linear Regression | glmnet |
regr.kknn | k-Nearest Neighbors | kknn |
regr.km | Kriging | DiceKriging |
regr.lm | Linear Regression | stats |
regr.nnet | Single Layer Neural Network | nnet |
regr.ranger | Random Forest | ranger |
regr.svm | SVM | e1071 |
regr.xgboost | Gradient Boosting | xgboost |
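Regression learners work the same way. The following sketch again assumes the underlying learner package (here ranger) is installed:

```r
library(mlr3)
library(mlr3learners)

# built-in regression task and a learner from the table above
task = tsk("mtcars")
learner = lrn("regr.ranger")

# train, predict, and evaluate with root mean squared error
learner$train(task)
prediction = learner$predict(task)
prediction$score(msr("regr.rmse"))
```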