
Bayesian Function-on-Scalar Regression with refundBayes::fosr_bayes

2026-03-03

Introduction

This vignette provides a detailed guide to the fosr_bayes() function in the refundBayes package, which fits Bayesian Function-on-Scalar Regression (FoSR) models using Stan. In contrast to scalar-on-function regression (SoFR), where the outcome is scalar and the predictors include functional variables, FoSR reverses this relationship: the outcome is a functional variable (a curve observed over a continuum) and the predictors are scalar.

The methodology follows the tutorial by Jiang, Crainiceanu, and Cui (2025), Tutorial on Bayesian Functional Regression Using Stan, published in Statistics in Medicine. The procedure for fitting a generalized multilevel Bayesian FoSR was introduced by Goldsmith, Zipunnikov, and Schrack (2015).

Install the refundBayes Package

The refundBayes package can be installed from GitHub:

library(remotes)
remotes::install_github("ZirenJiang/refundBayes")

Statistical Model

The FoSR Model

Function-on-Scalar Regression (FoSR) models the relationship between a functional outcome and one or more scalar predictors. Two key differences from traditional regression are: (1) the outcome \(Y_i(t)\) is multivariate and often high-dimensional, and (2) the residuals \(e_i(t)\) are correlated across \(t\) (points in the domain).

For subject \(i = 1, \ldots, n\), let \(Y_i(t)\) be the functional response observed at time points \(t = t_1, \ldots, t_M\) over a domain \(\mathcal{T}\). The domain \(\mathcal{T}\) is not restricted to \([0,1]\); it is determined by the actual time points in the data (e.g., \(\mathcal{T} = [1, 1440]\) for minute-level 24-hour data). The FoSR model assumes:

\[Y_i(t) = \sum_{p=1}^P X_{ip} \beta_p(t) + e_i(t)\]

where:

- \(X_{ip}\) is the value of the \(p\)-th scalar predictor for subject \(i\) (with \(X_{i1} = 1\) for a functional intercept),
- \(\beta_p(t)\) is the functional coefficient for the \(p\)-th predictor, and
- \(e_i(t)\) is the residual curve.

Each functional coefficient \(\beta_p(t)\) describes how the \(p\)-th scalar predictor affects the outcome at each point \(t\) in the domain, allowing the effect of a scalar predictor to vary smoothly over the domain.
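The generative model above can be sketched with a small simulation in base R. The coefficient shapes \(\beta_1(t) = \sin(2\pi t)\) and \(\beta_2(t) = \cos(2\pi t)\), the sample sizes, and the white-noise residuals are illustrative assumptions, not taken from the package:

```r
set.seed(1)
n <- 50; M <- 100
tgrid <- seq(0, 1, length.out = M)          # common grid t_1, ..., t_M
X <- cbind(1, rbinom(n, 1, 0.5))            # design: intercept + one scalar predictor
beta <- rbind(sin(2 * pi * tgrid),          # beta_1(t): functional intercept
              cos(2 * pi * tgrid))          # beta_2(t): effect of the predictor
E <- matrix(rnorm(n * M, sd = 0.5), n, M)   # residual curves e_i(t) (white noise here)
Y <- X %*% beta + E                         # n x M matrix of functional responses
dim(Y)
```

Each row of `Y` is one subject's curve evaluated on the shared grid, which is exactly the matrix layout that fosr_bayes() expects for the response.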

Modeling the Residual Structure

To account for the within-subject correlation in the residuals, the model decomposes \(e_i(t)\) using functional principal components:

\[e_i(t) = \sum_{j=1}^J \xi_{ij} \phi_j(t) + \epsilon_i(t)\]

where \(\phi_1(t), \ldots, \phi_J(t)\) are the eigenfunctions estimated via FPCA (using refund::fpca.face), \(\xi_{ij}\) are the subject-specific scores with \(\xi_{ij} \sim N(0, \lambda_j)\), and \(\epsilon_i(t) \sim N(0, \sigma_\epsilon^2)\) is independent measurement error. The eigenvalues \(\lambda_j\) and the error variance \(\sigma_\epsilon^2\) are estimated from the data.
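The package estimates this decomposition with refund::fpca.face, which adds smoothing; conceptually, the unsmoothed analogue is just an eigen-decomposition of the sample covariance of the residual curves. A base-R sketch on simulated residuals (the true eigenfunctions and score variances below are illustrative choices):

```r
set.seed(2)
n <- 40; M <- 60
tgrid <- seq(0, 1, length.out = M)
scores <- cbind(rnorm(n, sd = 2), rnorm(n, sd = 1))   # true scores xi_i1, xi_i2
phi_true <- cbind(sqrt(2) * sin(2 * pi * tgrid),
                  sqrt(2) * cos(2 * pi * tgrid))      # true eigenfunctions
E <- scores %*% t(phi_true) +
  matrix(rnorm(n * M, sd = 0.2), n, M)                # residual curves + noise

C <- cov(E)                        # M x M sample covariance of the curves
ev <- eigen(C, symmetric = TRUE)
J <- 2
Phi <- ev$vectors[, 1:J]           # estimated eigenfunctions phi_j(t), up to sign
lambda <- ev$values[1:J]           # estimated eigenvalues lambda_j (sorted decreasing)
xi <- E %*% Phi                    # estimated subject-specific scores xi_ij
```

fpca.face performs this same task but with penalized smoothing of the eigenfunctions and a data-driven choice of \(J\).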

Functional Coefficients via Penalized Splines

Each functional coefficient \(\beta_p(t)\) is represented using \(K\) spline basis functions \(\psi_1(t), \ldots, \psi_K(t)\):

\[\beta_p(t) = \sum_{k=1}^K b_{pk} \psi_k(t)\]

Substituting into the model:

\[Y_i(t) = \sum_{k=1}^K \left(\sum_{p=1}^P X_{ip} b_{pk}\right) \psi_k(t) + \sum_{j=1}^J \xi_{ij} \phi_j(t) + \epsilon_i(t)\]

Let \(B_{ik} = \sum_{p=1}^P X_{ip} b_{pk}\), and denote by \(\boldsymbol{\Psi}\) the \(M \times K\) matrix of spline basis evaluations, by \(\boldsymbol{\Phi}\) the \(M \times J\) matrix of eigenfunctions, and by \(\mathbf{B}_i = (B_{i1}, \ldots, B_{iK})^t\) the vector of combined spline coefficients for subject \(i\). In matrix form, the model becomes:

\[\mathbf{Y}_i = \boldsymbol{\Psi} \mathbf{B}_i + \boldsymbol{\Phi} \boldsymbol{\xi}_i + \boldsymbol{\epsilon}_i\]

where \(\mathbf{Y}_i = \{Y_i(t_1), \ldots, Y_i(t_M)\}^t\) is the \(M \times 1\) vector of observed functional data for subject \(i\).
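The basis matrix \(\boldsymbol{\Psi}\) can be sketched with the base splines package, mirroring the defaults spline_type = "bs" and spline_df = 10. Whether fosr_bayes() includes an intercept column in its basis is an assumption here, as is the example grid:

```r
library(splines)

M <- 100; K <- 10
tgrid <- seq(0, 1, length.out = M)
Psi <- bs(tgrid, df = K, intercept = TRUE)  # M x K B-spline evaluations
dim(Psi)

## Given spline coefficients b_p, the functional coefficient on the grid is:
b_p <- rnorm(K)
beta_p <- as.vector(Psi %*% b_p)            # beta_p(t_1), ..., beta_p(t_M)
```

Because the grid is shared across subjects, this one \(M \times K\) matrix serves every subject in the likelihood.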

Smoothness Penalty

Smoothness of each \(\beta_p(t)\) is induced through a quadratic penalty on the spline coefficients \(\mathbf{b}_p = (b_{p1}, \ldots, b_{pK})^t\):

\[p(\mathbf{b}_p) \propto \exp\left(-\frac{\mathbf{b}_p^t \mathbf{S} \mathbf{b}_p}{2\sigma_p^2}\right)\]

where \(\mathbf{S}\) is the penalty matrix and \(\sigma_p^2\) is the smoothing parameter for the \(p\)-th functional coefficient. Importantly, each functional coefficient \(\beta_p(t)\) has its own smoothing parameter \(\sigma_p^2\), allowing different levels of smoothness across predictors.
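As a sketch of what \(\mathbf{S}\) looks like, a common choice is the second-order difference penalty used by P-splines, \(\mathbf{S} = \mathbf{D}^t \mathbf{D}\) with \(\mathbf{D}\) the second-difference operator; the exact penalty matrix constructed by fosr_bayes() may differ:

```r
K <- 10
D <- diff(diag(K), differences = 2)   # (K-2) x K second-difference operator
S <- t(D) %*% D                       # K x K penalty matrix

## The quadratic form b' S b measures roughness of the coefficient vector:
b_smooth <- seq(0, 1, length.out = K)        # linear: zero second differences
b_rough  <- rep(c(0, 1), length.out = K)     # oscillating
pen_smooth <- drop(t(b_smooth) %*% S %*% b_smooth)
pen_rough  <- drop(t(b_rough) %*% S %*% b_rough)
c(smooth = pen_smooth, rough = pen_rough)
```

Coefficients that vary smoothly incur a small penalty, so larger \(\sigma_p^2\) permits wigglier \(\beta_p(t)\) while smaller \(\sigma_p^2\) shrinks it toward the penalty null space.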

Unlike the SoFR and FCox models, the FoSR implementation uses the original spline basis without the spectral reparametrisation. This choice follows Goldsmith et al. (2015) and simplifies interpretation by avoiding the need to back-transform the estimated coefficients.

Full Bayesian Model

The complete Bayesian FoSR model is:

\[\begin{cases} \mathbf{Y}_i = \boldsymbol{\Psi} \mathbf{B}_i + \boldsymbol{\Phi} \boldsymbol{\xi}_i + \boldsymbol{\epsilon}_i \\ B_{ik} = \sum_{p=1}^P X_{ip} b_{pk}, \quad k = 1, \ldots, K \\ p(\mathbf{b}_p) \propto \exp(-\mathbf{b}_p^t \mathbf{S} \mathbf{b}_p / 2\sigma_p^2), \quad p = 1, \ldots, P \\ \xi_{ij} \sim N(0, \lambda_j), \quad \lambda_j \sim p(\lambda_j), \quad j = 1, \ldots, J \\ \epsilon_i(t_m) \sim N(0, \sigma_\epsilon^2), \quad i = 1, \ldots, n, \; t = t_1, \ldots, t_M \\ \sigma_\epsilon^2 \sim p(\sigma_\epsilon^2), \quad \sigma_p^2 \sim p(\sigma_p^2), \quad p = 1, \ldots, P \end{cases}\]

The priors \(p(\lambda_j)\), \(p(\sigma_\epsilon^2)\), and \(p(\sigma_p^2)\) on the variance components are non-informative inverse-Gamma priors, \(IG(0.001, 0.001)\).

The model assumes that the functional responses are observed on a common set of grid points across all subjects. This ensures that the \(\boldsymbol{\Psi}\) and \(\boldsymbol{\Phi}\) matrices are identical across subjects, simplifying computation in Stan.

The fosr_bayes() Function

Usage

fosr_bayes(
  formula,
  data,
  joint_FPCA = NULL,
  runStan = TRUE,
  niter = 3000,
  nwarmup = 1000,
  nchain = 3,
  ncores = 1,
  spline_type = "bs",
  spline_df = 10
)

Arguments

Argument Description
formula Functional regression formula. The left-hand side should be the name of the functional response variable (an \(n \times M\) matrix in data). The right-hand side specifies scalar predictors using standard formula syntax.
data A data frame containing all variables used in the model. The functional response should be stored as an \(n \times M\) matrix, where each row is one subject and each column is one time point.
joint_FPCA A logical (TRUE/FALSE) vector of the same length as the number of functional predictors, indicating whether to jointly model FPCA. Default is NULL, which sets all entries to FALSE.
runStan Logical. Whether to run the Stan program. If FALSE, the function only generates the Stan code and data without sampling. Default is TRUE.
niter Total number of Bayesian posterior sampling iterations (including warmup). Default is 3000.
nwarmup Number of warmup (burn-in) iterations. Default is 1000.
nchain Number of Markov chains for posterior sampling. Default is 3.
ncores Number of CPU cores to use when executing the chains in parallel. Default is 1.
spline_type Type of spline basis used for the functional coefficients. Default is "bs" (B-splines).
spline_df Number of degrees of freedom (basis functions) for the spline basis. Default is 10.

Return Value

The function returns a list of class "refundBayes" containing the following elements:

Element Description
stanfit The Stan fit object (class stanfit). Can be used for convergence diagnostics, traceplots, and additional summaries via the rstan package.
spline_basis Basis functions used to reconstruct the functional coefficients from the posterior samples.
stancode A character string containing the generated Stan model code.
standata A list containing the data passed to the Stan model.
func_coef An array of posterior samples for functional coefficients, with dimensions \(Q \times P \times M\), where \(Q\) is the number of posterior samples, \(P\) is the number of scalar predictors, and \(M\) is the number of time points on the functional domain.
family The family type: "functional".

Formula Syntax

The formula syntax for FoSR differs from SoFR and FCox because the outcome is functional and the predictors are scalar. The left-hand side specifies the functional response, and the right-hand side lists scalar predictors:

y ~ X

where:

- y is the name of the functional response variable, stored as an \(n \times M\) matrix in data, and
- X is a scalar predictor; additional scalar predictors can be added with the usual + syntax.

Note that the FoSR formula does not use the s() term with tmat, lmat, and wmat that appears in SoFR and FCox models. This is because in FoSR, the functional variable is the outcome (not a predictor), and the spline basis for the functional coefficients is constructed internally using the spline_type and spline_df arguments.
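For instance, several scalar predictors combine with ordinary formula syntax; the variable names below are hypothetical:

```r
## A FoSR formula with two scalar covariates and a categorical predictor
f <- y ~ age + sex + factor(group)
all.vars(f)   # variables the model would look up in `data`
```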

Example: Bayesian FoSR

We demonstrate the fosr_bayes() function using a simulated example dataset with a functional response and one scalar predictor.

Load and Prepare Data

## Load the example data
FoSR_exp_data <- readRDS("data/example_data_FoSR.rds")

## Set the functional response
FoSR_exp_data$y <- FoSR_exp_data$MIMS

The example dataset contains:

- MIMS: an \(n \times M\) matrix of functional responses (copied into y above), and
- X: a scalar predictor used on the right-hand side of the formula.

Fit the Bayesian FoSR Model

library(refundBayes)

refundBayes_FoSR <- refundBayes::fosr_bayes(
  y ~ X,
  data = FoSR_exp_data,
  runStan = TRUE,
  niter = 1500,
  nwarmup = 500,
  nchain = 1,
  ncores = 1
)

In this call:

- y ~ X specifies the functional response y and the scalar predictor X,
- niter = 1500 and nwarmup = 500 yield 1,000 post-warmup posterior samples, and
- nchain = 1 and ncores = 1 run a single chain on one core to keep the example fast.

A Note on Computation

FoSR models are computationally more demanding than SoFR models because the likelihood involves all \(M\) time points for each of the \(n\) subjects. The example above uses a single chain for demonstration purposes. In practice, consider the trade-off between the number of basis functions, the number of FPCA components, and the computational budget.

Visualization

The plot() method for refundBayes objects displays the estimated functional coefficients with pointwise 95% credible intervals:

library(ggplot2)
plot(refundBayes_FoSR)

Extracting Posterior Summaries

Posterior summaries of the functional coefficients can be computed from the func_coef element. The func_coef object is an array with dimensions \(Q \times P \times M\), where \(Q\) is the number of posterior samples, \(P\) is the number of scalar predictors (including the intercept), and \(M\) is the number of time points:

## Posterior mean of the functional coefficient for the first scalar predictor
mean_curve <- apply(refundBayes_FoSR$func_coef[, 1, ], 2, mean)

## Pointwise 95% credible interval
upper_curve <- apply(refundBayes_FoSR$func_coef[, 1, ], 2,
                     function(x) quantile(x, prob = 0.975))
lower_curve <- apply(refundBayes_FoSR$func_coef[, 1, ], 2,
                     function(x) quantile(x, prob = 0.025))

Comparison with Frequentist Results

The Bayesian results can be compared with frequentist estimates obtained via mgcv::gam. As described in Crainiceanu et al. (2024), one key distinction is that the frequentist approach fits the model at each time point independently or using fast algorithms, while the Bayesian approach jointly models all time points with a shared smoothness prior and FPCA-based residual structure:

library(refund)

## The frequentist approach can be implemented using mgcv or refund::pffr
## See Crainiceanu et al. (2024) for details

fit.freq <- pffr(y ~ -1 + X, data = FoSR_exp_data,
                 bs.yindex = list(bs = "cc", k = 10))
plot(fit.freq)

Simulation studies in Jiang et al. (2025) show that the Bayesian FoSR achieves similar estimation accuracy (RISE) to the frequentist approach, while providing superior coverage of pointwise credible intervals.

Inspecting the Generated Stan Code

Setting runStan = FALSE allows you to inspect or modify the Stan code before running the model:

## Generate Stan code without running the sampler
fosr_code <- refundBayes::fosr_bayes(
  y ~ X,
  data = FoSR_exp_data,
  runStan = FALSE
)

## Print the generated Stan code
cat(fosr_code$stancode)

Practical Recommendations

- Start with the defaults (spline_type = "bs", spline_df = 10) and increase spline_df only if the estimated functional coefficients appear oversmoothed.
- For production analyses, run multiple chains (nchain of 3 or more) with a matching ncores, and check convergence using the stanfit element (e.g., via rstan diagnostics).
- For large \(n\) or \(M\), balance the number of spline basis functions and FPCA components against the computational budget, as discussed in "A Note on Computation".
- Use runStan = FALSE to inspect or modify the generated Stan code before committing to a long run.

References

Crainiceanu, C. M., Goldsmith, J., Leroux, A., and Cui, E. (2024). Functional Data Analysis with R. Chapman & Hall/CRC.

Goldsmith, J., Zipunnikov, V., and Schrack, J. (2015). Generalized multilevel function-on-scalar regression and principal component analysis. Biometrics, 71(2), 344-353.

Jiang, Z., Crainiceanu, C. M., and Cui, E. (2025). Tutorial on Bayesian functional regression using Stan. Statistics in Medicine.
