
localLLM: Running Local LLMs with 'llama.cpp' Backend

Provides R bindings to the 'llama.cpp' library for running large language models. The package uses a lightweight architecture where the C++ backend library is downloaded at runtime rather than bundled with the package. Package features include text generation, reproducible generation, and parallel inference.
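A minimal workflow sketch of what "backend downloaded at runtime" implies in practice. The function names below (`install_localLLM()`, `model_load()`, `context_create()`, `generate()`) are assumptions for illustration, not verified against the package's exported API; consult the reference manual and the Get Started vignette for the actual functions.

```r
# Hypothetical workflow sketch -- function names are assumptions,
# not confirmed against the localLLM API.
library(localLLM)

# Fetch the pre-built 'llama.cpp' C++ backend at runtime
# (done once; the package itself ships without the backend).
install_localLLM()

# Load a local GGUF model file.
model <- model_load("path/to/model.gguf")

# A fixed seed would support the reproducible generation the
# description advertises.
ctx <- context_create(model, seed = 42)
generate(ctx, "Summarize the history of R in one sentence.")
```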

Version: 1.1.0
Depends: R (≥ 3.6.0)
Imports: Rcpp (≥ 1.0.14), tools, utils, jsonlite, digest, curl, R.utils
LinkingTo: Rcpp
Suggests: testthat (≥ 3.0.0), covr, irr, knitr, rmarkdown
Published: 2025-12-17
DOI: 10.32614/CRAN.package.localLLM
Author: Eddie Yang [aut], Yaosheng Xu [aut, cre]
Maintainer: Yaosheng Xu <xu2009 at purdue.edu>
BugReports: https://github.com/EddieYang211/localLLM/issues
License: MIT + file LICENSE
URL: https://github.com/EddieYang211/localLLM
NeedsCompilation: yes
SystemRequirements: C++17, libcurl (optional, for model downloading)
Materials: README
CRAN checks: localLLM results

Documentation:

Reference manual: localLLM.html, localLLM.pdf
Vignettes: Frequently Asked Questions (source, R code)
Get Started with localLLM (source, R code)
Reproducible Output (source, R code)
Basic Text Generation (source, R code)
Model Comparison & Validation (source, R code)
Ollama Integration (source, R code)
Parallel Processing (source, R code)

Downloads:

Package source: localLLM_1.1.0.tar.gz
Windows binaries: r-devel: localLLM_1.1.0.zip, r-release: localLLM_1.1.0.zip, r-oldrel: localLLM_1.1.0.zip
macOS binaries: r-release (arm64): localLLM_1.1.0.tgz, r-oldrel (arm64): localLLM_1.1.0.tgz, r-release (x86_64): localLLM_1.1.0.tgz, r-oldrel (x86_64): localLLM_1.1.0.tgz
Old sources: localLLM archive

Linking:

Please use the canonical form https://CRAN.R-project.org/package=localLLM to link to this page.
