
nlmixr2extra


The goal of nlmixr2extra is to provide tools for common pharmacometric tasks with nlmixr2 models, such as bootstrapping and covariate selection.

Installation

You can install the development version of nlmixr2extra from GitHub with:

# install.packages("remotes")
remotes::install_github("nlmixr2/nlmixr2data")
remotes::install_github("nlmixr2/lotri")
remotes::install_github("nlmixr2/rxode2")
remotes::install_github("nlmixr2/nlmixr2est")
remotes::install_github("nlmixr2/nlmixr2extra")

Example of bootstrapFit()

This is a basic example of the bootstrapping functionality provided by this package:

library(nlmixr2est)
#> Loading required package: nlmixr2data
library(nlmixr2extra)
# basic example code
# The basic model consists of an ini block that has initial estimates
one.compartment <- function() {
  ini({
    tka <- 0.45 # Log Ka
    tcl <- 1 # Log Cl
    tv <- 3.45    # Log V
    eta.ka ~ 0.6
    eta.cl ~ 0.3
    eta.v ~ 0.1
    add.sd <- 0.7
  })
  # and a model block with the error specification and model specification
  model({
    ka <- exp(tka + eta.ka)
    cl <- exp(tcl + eta.cl)
    v <- exp(tv + eta.v)
    d/dt(depot) = -ka * depot
    d/dt(center) = ka * depot - cl / v * center
    cp = center / v
    cp ~ add(add.sd)
  })
}
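
# Optionally, evaluating the model function parses the ini/model blocks and
# returns the underlying model (ui) object, which prints the model definition.
# This is just a quick sanity check before estimation and is not required for
# the fit below.
one.compartment()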

# The fit is performed by nlmixr()/nlmixr2(), specifying the model, the data,
# and the estimation method (in a real analysis, nBurn and nEm would be much higher).
fit <- nlmixr2(one.compartment, theo_sd,  est="saem", saemControl(print=0, nBurn = 10, nEm = 20))
#> ℹ parameter labels from comments will be replaced by 'label()'
#> → loading into symengine environment...
#> → pruning branches (`if`/`else`) of saem model...
#> ✔ done
#> → finding duplicate expressions in saem model...
#> [====|====|====|====|====|====|====|====|====|====] 0:00:00
#> → optimizing duplicate expressions in saem model...
#> [====|====|====|====|====|====|====|====|====|====] 0:00:00
#> ✔ done
#> rxode2 2.0.11.9000 using 8 threads (see ?getRxThreads)
#>   no cache: create with `rxCreateCache()`
#> Calculating covariance matrix
#> → loading into symengine environment...
#> → pruning branches (`if`/`else`) of saem model...
#> ✔ done
#> → finding duplicate expressions in saem predOnly model 0...
#> → finding duplicate expressions in saem predOnly model 1...
#> → optimizing duplicate expressions in saem predOnly model 1...
#> → finding duplicate expressions in saem predOnly model 2...
#> ✔ done
#> → Calculating residuals/tables
#> ✔ done
#> → compress origData in nlmixr2 object, save 5952
#> → compress phiM in nlmixr2 object, save 2832
#> → compress parHist in nlmixr2 object, save 1968
#> → compress saem0 in nlmixr2 object, save 24944

# In a real bootstrap, nboot would be much higher.
fit2 <- suppressMessages(bootstrapFit(fit, nboot = 5))
#> 001: 0.289754    0.955016    3.449185    0.381052    0.078862    0.016351    1.435990    
#> 002: 0.341570    1.042356    3.475026    0.361999    0.074919    0.015533    0.968865    
#> 003: 0.472750    1.038382    3.506706    0.343899    0.081378    0.014757    0.797063    
#> 004: 0.496706    1.074803    3.510905    0.326704    0.077309    0.014019    0.722518    
#> 005: 0.534589    1.100823    3.486683    0.310369    0.073930    0.013318    0.689762    
#> 006: 0.536212    1.092222    3.496588    0.294850    0.070233    0.012652    0.646975    
#> 007: 0.498232    1.116233    3.490177    0.296055    0.066722    0.012019    0.657194    
#> 008: 0.472135    1.087746    3.486804    0.281253    0.063386    0.011418    0.665569    
#> 009: 0.414591    1.093383    3.475219    0.275684    0.060216    0.011823    0.654913    
#> 010: 0.463636    1.093226    3.468536    0.302019    0.060954    0.010231    0.644346    
#> 011: 0.471632    1.094277    3.473210    0.300665    0.057946    0.010692    0.645369    
#> 012: 0.487830    1.101393    3.474948    0.306205    0.054575    0.011432    0.636770    
#> 013: 0.479627    1.106742    3.473516    0.303806    0.050805    0.012334    0.635296    
#> 014: 0.488082    1.104715    3.475776    0.299002    0.052067    0.012733    0.636187    
#> 015: 0.491629    1.105114    3.478241    0.293828    0.052474    0.013105    0.633431    
#> 016: 0.496946    1.105220    3.481824    0.288786    0.052556    0.013002    0.633009    
#> 017: 0.501182    1.104867    3.484359    0.291721    0.051772    0.013106    0.630237    
#> 018: 0.504760    1.104384    3.485512    0.294660    0.051390    0.012882    0.628937    
#> 019: 0.503525    1.106547    3.485744    0.294525    0.051059    0.012606    0.628119    
#> 020: 0.504792    1.106269    3.487000    0.292017    0.051074    0.012392    0.628590    
#> 021: 0.504680    1.105623    3.488382    0.289388    0.051105    0.012258    0.629262    
#> 022: 0.506024    1.105160    3.490148    0.285219    0.051702    0.012195    0.629037    
#> 023: 0.511997    1.105357    3.491865    0.290051    0.051408    0.012157    0.629789    
#> 024: 0.515887    1.105808    3.492777    0.292475    0.050884    0.012147    0.630087    
#> 025: 0.517210    1.105963    3.492826    0.295889    0.049986    0.012198    0.629288    
#> 026: 0.516025    1.107520    3.492264    0.297170    0.049610    0.012291    0.628565    
#> 027: 0.515202    1.108956    3.491510    0.296660    0.049518    0.012327    0.628714    
#> 028: 0.516875    1.109118    3.491399    0.297858    0.048998    0.012389    0.628230    
#> 029: 0.518964    1.109633    3.491344    0.299585    0.048763    0.012417    0.628187    
#> 030: 0.520491    1.109555    3.491727    0.299411    0.048814    0.012561    0.627538    
#> 001: 0.322787    0.951634    3.447451    0.381052    0.076738    0.016351    1.530345    
#> 002: 0.427636    0.897920    3.465514    0.361999    0.072901    0.015533    1.087413    
#> 003: 0.508253    0.828660    3.472124    0.343899    0.069256    0.014757    0.838074    
#> 004: 0.509892    0.834282    3.468804    0.326704    0.066873    0.014019    0.673776    
#> 005: 0.508392    0.823533    3.444179    0.310369    0.078466    0.013318    0.654094    
#> 006: 0.461647    0.841721    3.438128    0.294850    0.074543    0.012652    0.607374    
#> 007: 0.449577    0.846887    3.427116    0.328131    0.070816    0.012019    0.613524    
#> 008: 0.452729    0.858913    3.420601    0.316657    0.086343    0.011418    0.600919    
#> 009: 0.423963    0.842949    3.420380    0.329854    0.089491    0.010848    0.594947    
#> 010: 0.428721    0.855938    3.425315    0.354540    0.091239    0.006169    0.606052    
#> 011: 0.453442    0.853622    3.427827    0.351512    0.086620    0.006578    0.608115    
#> 012: 0.462225    0.853917    3.425827    0.361433    0.082071    0.006803    0.605303    
#> 013: 0.458003    0.859284    3.422302    0.375770    0.078209    0.006864    0.609796    
#> 014: 0.463493    0.859918    3.421436    0.394293    0.077441    0.007209    0.612385    
#> 015: 0.461922    0.861117    3.422084    0.391491    0.075832    0.007177    0.610504    
#> 016: 0.463641    0.860191    3.423481    0.388623    0.072947    0.007093    0.610327    
#> 017: 0.466608    0.861250    3.424922    0.387369    0.071245    0.007120    0.609212    
#> 018: 0.467115    0.863905    3.424010    0.385711    0.070812    0.007068    0.607741    
#> 019: 0.464636    0.865482    3.423091    0.384225    0.070733    0.006891    0.605629    
#> 020: 0.466311    0.867835    3.422920    0.388178    0.071830    0.006899    0.603946    
#> 021: 0.467791    0.867667    3.423117    0.388788    0.072470    0.006804    0.603274    
#> 022: 0.467680    0.867848    3.423223    0.389921    0.072747    0.006679    0.601932    
#> 023: 0.471964    0.867613    3.423780    0.392002    0.073413    0.006733    0.601413    
#> 024: 0.471012    0.869247    3.423531    0.391023    0.074191    0.006624    0.600971    
#> 025: 0.471592    0.868571    3.423338    0.394052    0.075357    0.006510    0.600329    
#> 026: 0.470657    0.869562    3.423072    0.395694    0.076004    0.006495    0.600176    
#> 027: 0.469108    0.870772    3.422378    0.395820    0.076448    0.006516    0.600126    
#> 028: 0.467693    0.871474    3.422045    0.393456    0.075655    0.006647    0.599299    
#> 029: 0.468393    0.872718    3.421465    0.391148    0.075150    0.006719    0.599974    
#> 030: 0.468304    0.873833    3.420865    0.390229    0.075320    0.006836    0.599273    
#> 001: 0.321454    0.971377    3.443146    0.381052    0.077198    0.016351    1.843240    
#> 002: 0.404381    0.984655    3.454425    0.361999    0.073338    0.015533    1.327013    
#> 003: 0.524777    0.923516    3.448166    0.343899    0.069671    0.014757    0.973899    
#> 004: 0.464318    0.918715    3.444206    0.326704    0.066188    0.014019    0.899000    
#> 005: 0.505866    0.944304    3.428324    0.369671    0.075402    0.013318    0.849983    
#> 006: 0.463863    0.956021    3.409451    0.351188    0.087264    0.012652    0.825540    
#> 007: 0.428948    0.950593    3.414305    0.333628    0.085637    0.012019    0.835024    
#> 008: 0.421492    0.943008    3.403544    0.316947    0.086564    0.012203    0.836564    
#> 009: 0.393406    0.940308    3.406897    0.301100    0.083656    0.014303    0.847300    
#> 010: 0.441726    0.936843    3.412050    0.258291    0.089442    0.013228    0.846843    
#> 011: 0.463308    0.937482    3.413063    0.248330    0.094996    0.012282    0.848071    
#> 012: 0.453256    0.941789    3.410521    0.245399    0.103340    0.011370    0.838674    
#> 013: 0.440268    0.949914    3.407022    0.233480    0.106164    0.010496    0.837572    
#> 014: 0.443334    0.949030    3.408971    0.243138    0.108268    0.010403    0.840363    
#> 015: 0.447580    0.948484    3.410579    0.244311    0.107791    0.010531    0.837262    
#> 016: 0.446239    0.951883    3.411966    0.243853    0.107251    0.010434    0.835515    
#> 017: 0.446532    0.954138    3.413003    0.246637    0.106016    0.010641    0.836453    
#> 018: 0.446437    0.956413    3.411380    0.244549    0.106120    0.010611    0.836756    
#> 019: 0.440359    0.957282    3.410359    0.246071    0.106063    0.010584    0.837482    
#> 020: 0.438193    0.958720    3.410340    0.248115    0.108699    0.010520    0.836311    
#> 021: 0.438664    0.958757    3.410299    0.247831    0.110066    0.010608    0.834780    
#> 022: 0.438099    0.958202    3.411346    0.251382    0.111559    0.010467    0.834395    
#> 023: 0.442660    0.957814    3.412045    0.250540    0.111833    0.010353    0.835423    
#> 024: 0.445029    0.959311    3.412104    0.252011    0.112183    0.010420    0.836493    
#> 025: 0.445302    0.957116    3.412285    0.252143    0.111216    0.010417    0.837473    
#> 026: 0.445494    0.959271    3.411336    0.254985    0.111211    0.010295    0.836981    
#> 027: 0.442450    0.961509    3.410295    0.255564    0.111479    0.010272    0.836688    
#> 028: 0.442704    0.961873    3.410156    0.257600    0.110936    0.010306    0.835889    
#> 029: 0.443796    0.962325    3.409781    0.257516    0.111795    0.010518    0.835939    
#> 030: 0.444443    0.962669    3.409480    0.259052    0.111135    0.010623    0.836011    
#> 001: 0.308790    0.955292    3.452306    0.381052    0.078745    0.016351    1.610290    
#> 002: 0.320194    0.972929    3.441237    0.361999    0.074808    0.015533    1.052149    
#> 003: 0.388375    0.903049    3.451275    0.343899    0.071067    0.014757    0.876051    
#> 004: 0.352724    0.943163    3.446667    0.326704    0.067514    0.014019    0.848090    
#> 005: 0.370729    0.942207    3.437666    0.310369    0.064138    0.013318    0.811039    
#> 006: 0.340865    0.946849    3.433523    0.294850    0.067188    0.012652    0.778070    
#> 007: 0.321317    0.929801    3.430701    0.280108    0.078825    0.012019    0.807483    
#> 008: 0.297800    0.954520    3.419284    0.266103    0.074883    0.012974    0.812512    
#> 009: 0.268345    0.941798    3.412954    0.252797    0.071139    0.012325    0.820797    
#> 010: 0.284470    0.945655    3.410890    0.167484    0.058407    0.009044    0.819982    
#> 011: 0.304119    0.956041    3.413440    0.160651    0.058828    0.009528    0.823598    
#> 012: 0.310049    0.953562    3.414981    0.158940    0.058257    0.009152    0.821713    
#> 013: 0.294795    0.958102    3.411723    0.154670    0.062969    0.009092    0.826275    
#> 014: 0.295332    0.962803    3.410459    0.150207    0.064431    0.008887    0.830045    
#> 015: 0.299234    0.965921    3.411482    0.147028    0.065119    0.008854    0.828014    
#> 016: 0.296557    0.968780    3.412439    0.142049    0.062697    0.008819    0.829825    
#> 017: 0.301181    0.968624    3.412841    0.141986    0.063694    0.008653    0.829478    
#> 018: 0.302265    0.969492    3.411867    0.141653    0.063770    0.008469    0.831797    
#> 019: 0.297096    0.971173    3.411014    0.139000    0.063844    0.008428    0.832478    
#> 020: 0.295719    0.970786    3.411676    0.136934    0.062973    0.008331    0.832896    
#> 021: 0.296313    0.970269    3.412745    0.133780    0.062369    0.008326    0.832039    
#> 022: 0.295929    0.968986    3.413278    0.131895    0.062932    0.008154    0.831096    
#> 023: 0.300475    0.969702    3.413923    0.132320    0.063767    0.007973    0.830692    
#> 024: 0.305282    0.970974    3.414907    0.132679    0.064361    0.007801    0.829612    
#> 025: 0.305960    0.970655    3.415266    0.134425    0.064468    0.007752    0.828770    
#> 026: 0.305518    0.972429    3.414596    0.133795    0.064520    0.007710    0.829005    
#> 027: 0.305864    0.972931    3.415149    0.133015    0.064204    0.007695    0.829063    
#> 028: 0.305834    0.973216    3.414863    0.132775    0.064447    0.007567    0.828023    
#> 029: 0.307110    0.972875    3.414641    0.134106    0.064937    0.007559    0.827898    
#> 030: 0.308437    0.973293    3.414467    0.134239    0.065239    0.007624    0.827103    
#> 001: 0.250224    0.973350    3.446505    0.381052    0.086079    0.016351    1.629574    
#> 002: 0.238043    0.945447    3.463598    0.361999    0.081775    0.015533    1.148022    
#> 003: 0.302269    0.841802    3.468763    0.343899    0.090469    0.014757    0.857538    
#> 004: 0.243896    0.865882    3.459904    0.326704    0.111023    0.014019    0.756165    
#> 005: 0.282010    0.882191    3.454090    0.310369    0.138965    0.013318    0.732686    
#> 006: 0.222474    0.885134    3.439456    0.294850    0.137408    0.013019    0.659474    
#> 007: 0.205525    0.892145    3.433143    0.280108    0.159956    0.013598    0.644376    
#> 008: 0.196289    0.882056    3.422450    0.266103    0.151958    0.016950    0.625512    
#> 009: 0.154722    0.867262    3.413034    0.252797    0.153009    0.016616    0.622420    
#> 010: 0.161861    0.886435    3.409333    0.179466    0.147778    0.015684    0.616576    
#> 011: 0.178851    0.885066    3.414862    0.190479    0.150189    0.016168    0.613058    
#> 012: 0.192803    0.891881    3.415838    0.197485    0.142894    0.016087    0.612558    
#> 013: 0.179534    0.903459    3.410968    0.187154    0.138015    0.016358    0.610325    
#> 014: 0.186389    0.904350    3.413515    0.184616    0.140652    0.016013    0.611363    
#> 015: 0.191013    0.904880    3.416840    0.180732    0.138288    0.016149    0.607843    
#> 016: 0.194851    0.906830    3.417924    0.179089    0.136730    0.017363    0.605451    
#> 017: 0.196487    0.907656    3.418840    0.179106    0.134246    0.017928    0.602876    
#> 018: 0.199138    0.909915    3.418241    0.180375    0.132986    0.018225    0.601193    
#> 019: 0.195281    0.910217    3.417478    0.181078    0.133218    0.018439    0.601077    
#> 020: 0.193751    0.911645    3.417332    0.179321    0.132558    0.018496    0.600899    
#> 021: 0.195542    0.912633    3.418240    0.178133    0.131680    0.018178    0.600791    
#> 022: 0.196856    0.913167    3.419484    0.176285    0.131776    0.018081    0.600220    
#> 023: 0.201760    0.912215    3.420795    0.178293    0.131133    0.018055    0.599618    
#> 024: 0.206143    0.911332    3.421689    0.180024    0.130477    0.018251    0.598885    
#> 025: 0.208976    0.909441    3.422674    0.181317    0.128259    0.018617    0.598188    
#> 026: 0.209246    0.909773    3.422754    0.182192    0.127371    0.018828    0.597104    
#> 027: 0.210086    0.911391    3.422127    0.185631    0.126914    0.019049    0.596365    
#> 028: 0.210242    0.912131    3.422173    0.187723    0.125802    0.019472    0.595784    
#> 029: 0.211096    0.912247    3.421602    0.191255    0.124943    0.019610    0.595855    
#> 030: 0.211086    0.912450    3.421134    0.193390    0.125181    0.019772    0.595781    
fit2
#> ── nlmixr² SAEM OBJF by FOCEi approximation ──
#> 
#>  Gaussian/Laplacian Likelihoods: AIC(fit) or fit$objf etc. 
#>  FOCEi CWRES & Likelihoods: addCwres(fit) 
#> 
#> ── Time (sec fit$time): ──
#> 
#>         setup covariance saem table compress other
#> elapsed 0.001       7.18  3.2  0.07     0.08 4.129
#> 
#> ── Population Parameters (fit$parFixed or fit$parFixedDf): ──
#> 
#>        Parameter  Est.     SE  %RSE Back-transformed(95%CI) BSV(CV%)
#> tka       Log Ka 0.459  0.127  27.8       1.58 (1.23, 2.03)     70.2
#> tcl       Log Cl  1.01 0.0895  8.85       2.75 (2.31, 3.28)     27.7
#> tv         Log V  3.45  0.034 0.985       31.6 (29.6, 33.8)     13.2
#> add.sd           0.696                                0.696         
#>        Shrink(SD)%
#> tka        -5.46% 
#> tcl        0.387% 
#> tv          13.8% 
#> add.sd            
#>  
#>   Covariance Type (fit$covMethod): boot5
#>     other calculated covs (setCov()): linFim
#>   No correlations in between subject variability (BSV) matrix
#>   Full BSV covariance (fit$omega) or correlation (fit$omegaR; diagonals=SDs) 
#>   Distribution stats (mean/skewness/kurtosis/p-value) available in fit$shrink 
#>   Censoring (fit$censInformation): No censoring
#> 
#> ── Fit Data (object fit is a modified tibble): ──
#> # A tibble: 132 × 19
#>   ID     TIME    DV  PRED    RES IPRED   IRES  IWRES eta.ka eta.cl   eta.v    cp
#>   <fct> <dbl> <dbl> <dbl>  <dbl> <dbl>  <dbl>  <dbl>  <dbl>  <dbl>   <dbl> <dbl>
#> 1 1      0     0.74  0     0.74   0     0.74   1.06   0.143 -0.439 -0.0982  0   
#> 2 1      0.25  2.84  3.27 -0.430  4.06 -1.22  -1.75   0.143 -0.439 -0.0982  4.06
#> 3 1      0.57  6.57  5.85  0.721  7.08 -0.508 -0.731  0.143 -0.439 -0.0982  7.08
#> # … with 129 more rows, and 7 more variables: depot <dbl>, center <dbl>,
#> #   ka <dbl>, cl <dbl>, v <dbl>, tad <dbl>, dosenum <dbl>
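
The accessors printed in the summary above can be used to pull the bootstrap-based results out of the fit directly; for example (using only accessors that appear in the printout, so these should be available on any nlmixr2 fit object):

fit2$covMethod    # covariance method in use ("boot5" after the bootstrap above)
fit2$parFixedDf   # population parameter table with the bootstrap-based SEs/CIs
fit2$omega        # between-subject variability (BSV) covariance matrix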
