Robust generalized linear models (GLMs) using a mixture method, as described in Beath (2018) <doi:10.1080/02664763.2017.1414164>. The data are assumed to be a mixture of standard observations, modelled by a generalized linear model, and outlier observations from an overdispersed generalized linear model. The overdispersed generalized linear model is obtained by including a normally distributed random effect in the linear predictor of the generalized linear model.
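A minimal usage sketch, not a verified example: it assumes the package's main fitting function is robmixglm() taking formula, family, and data arguments, and that outlierProbs() returns per-observation outlier probabilities; check the reference manual for the exact interface.

    # Sketch only: robmixglm() and outlierProbs() are assumed names.
    library(robmixglm)
    library(MASS)   # the 'forbes' data set is used purely for illustration

    # Fit a robust Gaussian GLM: each observation is modelled as coming
    # either from a standard GLM component or from an overdispersed
    # component (normal random effect in the linear predictor) that
    # absorbs outliers.
    set.seed(123)
    fit <- robmixglm(bp ~ pres, family = "gaussian", data = MASS::forbes)

    summary(fit)          # coefficients from the robust fit
    # outlierProbs(fit)   # posterior outlier probabilities, if provided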
Version: 1.2-4
Depends: R (≥ 3.2.0)
Imports: fastGHQuad, stats, bbmle, VGAM, actuar, Rcpp (≥ 0.12.15), methods, boot, numDeriv, parallel, doParallel, foreach, doRNG, MASS
LinkingTo: Rcpp
Suggests: R.rsp, robustbase, lattice, forward
Published: 2024-09-27
DOI: 10.32614/CRAN.package.robmixglm
Author: Ken Beath [aut, cre]
Maintainer: Ken Beath <ken at kjbeath.id.au>
Contact: Ken Beath <ken@kjbeath.id.au>
License: GPL-2 | GPL-3 [expanded from: GPL (≥ 2)]
NeedsCompilation: yes
Materials: NEWS
CRAN checks: robmixglm results
Reference manual: robmixglm.pdf
Vignettes: robmixglm: An R Package for the Analysis of Robust Generalized Linear Models (source)
Package source: robmixglm_1.2-4.tar.gz
Windows binaries: r-devel: robmixglm_1.2-4.zip, r-release: robmixglm_1.2-4.zip, r-oldrel: robmixglm_1.2-4.zip
macOS binaries: r-release (arm64): robmixglm_1.2-4.tgz, r-oldrel (arm64): robmixglm_1.2-4.tgz, r-release (x86_64): robmixglm_1.2-4.tgz, r-oldrel (x86_64): robmixglm_1.2-4.tgz
Old sources: robmixglm archive
Please use the canonical form https://CRAN.R-project.org/package=robmixglm to link to this page.