shiny.ollama
R Shiny interface for chatting with LLMs offline on your local machine with Ollama
Chat with LLMs on your machine, without internet and with complete privacy, via Ollama, powered by an R Shiny interface. For more information on Ollama, visit https://ollama.com.
Important: The shiny.ollama package requires Ollama to be installed and available on your system. Without Ollama, this package will not function as intended. To install Ollama, refer to the How to Install Ollama section below.
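As a quick sanity check from R, you can confirm that the ollama binary is discoverable on your PATH (a minimal sketch using base R, independent of this package):

# Returns the path to the ollama binary, or "" if it is not found on the PATH
Sys.which("ollama")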
To install the shiny.ollama package, follow these steps in R:
# Install devtools if not already installed
install.packages("devtools")
# Install shiny.ollama from GitHub
devtools::install_github("ineelhere/shiny.ollama")
Once installed, you can launch the shiny.ollama app by running:
library(shiny.ollama)
# Launch the Shiny app
shiny.ollama::run_app()
The shiny.ollama package provides a simple Shiny chat interface for LLMs running locally via Ollama.
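For context, Ollama serves models through a local HTTP API (by default at http://localhost:11434), which is what a chat interface like this relies on. The following is a minimal, illustrative sketch of one such request using the httr2 package; the model name "llama3" is only an example, and this is not necessarily how shiny.ollama implements its calls internally:

library(httr2)

# Send a single prompt to the local Ollama API and print the reply
# (assumes Ollama is running and the model has already been pulled)
resp <- request("http://localhost:11434/api/generate") |>
  req_body_json(list(
    model  = "llama3",   # example model name
    prompt = "Hello!",
    stream = FALSE       # request one complete JSON response
  )) |>
  req_perform()

resp_body_json(resp)$response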
To use this package, ensure Ollama is installed:
Visit the Ollama website and download the installer for your OS.
Run the installer and follow the on-screen steps.
Verify the installation by running this command in your terminal:
ollama --version
You should see the version number displayed if the installation was successful.
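You can run the same check from an R session with base R's system2(), which simply shells out to the ollama binary:

# Print the installed Ollama version from within R
system2("ollama", "--version")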
Pull a model to your local machine to start with, and use it in the app; a small example follows below.
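For instance, from R (the model name "llama3" is only an illustration; substitute any model from the Ollama library):

# Pull an example model, then list the models available locally
system2("ollama", c("pull", "llama3"))  # "llama3" is an example model name
system2("ollama", "list")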
This project is licensed under the Apache License 2.0.
Collaboration and feedback are most welcome. More updates to come. Happy coding! 🎉