shiny.ollama: R Shiny Interface for Chatting with LLMs Offline via Ollama
Experience seamless, private, and offline AI conversations right
on your machine! shiny.ollama provides a user-friendly R
Shiny interface to interact with LLMs locally, powered by Ollama.
Important: shiny.ollama requires Ollama
to be installed on your system. Without it, this package will not
function. Follow the Installation
Guide below to set up Ollama first.
Install the stable version from CRAN:

```r
install.packages("shiny.ollama")
```

Or install the development version from GitHub:

```r
# Install devtools if not already installed
install.packages("devtools")
devtools::install_github("ineelhere/shiny.ollama")
```

Launch the Shiny app in R with:
```r
library(shiny.ollama)

# Start the application
shiny.ollama::run_app()
```

To use this package, install Ollama first. Verify your installation with:

```sh
ollama --version
```

If successful, the version number will be displayed.
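Before launching the app, you can optionally confirm from within R that the local Ollama server is reachable. This is a minimal sketch, not part of the package itself; it assumes Ollama is serving its HTTP API on the default port 11434 and that the `/api/version` endpoint is available (both are Ollama defaults):

```r
# Returns TRUE if the local Ollama server responds, FALSE otherwise.
ollama_running <- tryCatch({
  # readLines() can read directly from a URL connection in base R.
  resp <- readLines("http://localhost:11434/api/version", warn = FALSE)
  length(resp) > 0
}, error = function(e) FALSE)

if (!ollama_running) {
  message("Ollama does not appear to be running; start it with 'ollama serve'.")
}
```

If this reports that Ollama is not running, start the Ollama service before calling `shiny.ollama::run_app()`.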
This R package is an independent, passion-driven open source
initiative, released under the
Apache License 2.0. It is not affiliated
with, owned by, funded by, or influenced by any external organization.
The project is dedicated to fostering a community of developers who
share a love for coding and collaborative innovation.
Contributions, feedback, and feature requests are always welcome!
Stay tuned for more updates. 🚀