
shiny.ollama

An R Shiny interface for chatting with LLMs offline, on your local machine, via Ollama

Chat with LLMs on your own machine, with no internet connection and complete privacy, through an R Shiny interface powered by Ollama. For more information on Ollama, visit https://ollama.com.

Work in progress: currently a basic release.

⚠️ Disclaimer

Important: The shiny.ollama package requires Ollama to be installed and available on your system. Without Ollama, this package will not function as intended.

To install Ollama, refer to the How to Install Ollama section below.

📦 Installation

To install the shiny.ollama package, follow these steps in R:

# Install devtools if not already installed
install.packages("devtools")

# Install shiny.ollama from GitHub
devtools::install_github("ineelhere/shiny.ollama")
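Alternatively, the lighter-weight remotes package (which devtools wraps for this task) provides the same installer; this is an optional alternative, not a requirement:

# Alternative: install directly with remotes
install.packages("remotes")
remotes::install_github("ineelhere/shiny.ollama")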

🚀 Usage

Once installed, you can use the shiny.ollama app by running:

library(shiny.ollama)

# Launch the Shiny app
shiny.ollama::run_app()
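Before launching, you can optionally check from R that the Ollama server is reachable. This is a minimal sketch, assuming Ollama's default local endpoint (http://localhost:11434) and using the httr package; adjust it if your setup differs:

library(httr)

# Ask Ollama's local API for the list of installed models
resp <- tryCatch(GET("http://localhost:11434/api/tags"), error = function(e) NULL)

if (is.null(resp) || status_code(resp) != 200) {
  message("Ollama does not appear to be running. Start it (e.g. `ollama serve`) and retry.")
} else {
  shiny.ollama::run_app()
}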

✨ Features

The shiny.ollama package provides a simple chat interface to locally hosted Ollama models, entirely offline.

Example

Run the following example to launch the app:

library(shiny.ollama)

# Run the Shiny app
shiny.ollama::run_app()

📥 How to Install Ollama

To use this package, ensure Ollama is installed:

  1. Visit the Ollama website (https://ollama.com) and download the installer for your operating system.

  2. Run the installer and follow the on-screen steps.

  3. Verify the installation by running this command in your terminal:

    ollama --version

    You should see the version number displayed if the installation was successful.

  4. Pull a model to your local machine to start with (see the example below) and select it in the app.
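
For example, from a terminal (the model name below is only an illustration; pick any model from the Ollama library):

    # Download a model to your machine
    ollama pull llama3.2

    # Confirm it is available locally
    ollama list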

📄 License

This project is licensed under the Apache License 2.0.


Collaboration and feedback are most welcome. More updates to come. Happy coding! 🎉
