Large Language Models (LLMs) represent a groundbreaking advancement in data science and programming, and they also extend what is possible in R. This package provides a seamless interface for integrating the 'OpenAI' Web APIs into R, leveraging LLM-based AI techniques for efficient knowledge discovery and data analysis (see the 'OpenAI' Web API details at <https://openai.com/blog/openai-api>). Earlier functions such as seamless translation and image generation have been moved to the separate packages 'deepRstudio' and 'stableDiffusion4R'.
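A minimal usage sketch follows, assuming the chat4R() function exported by this package and the OPENAI_API_KEY environment variable described in its documentation; the API key shown is a placeholder.

# Install the package from CRAN and load it.
install.packages("chatAI4R")
library(chatAI4R)

# The package reads your OpenAI API key from an environment variable
# (replace "sk-..." with your own key).
Sys.setenv(OPENAI_API_KEY = "sk-...")

# Send a single prompt to the OpenAI chat API and inspect the response.
res <- chat4R("Summarize the main advantages of using R for data analysis.")
print(res)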
Version: 0.2.10
Depends: R (≥ 4.2.0)
Imports: httr, jsonlite, assertthat, clipr, crayon, rstudioapi, future, igraph, deepRstudio, pdftools, xml2, rvest
Suggests: testthat, knitr
Published: 2023-09-12
DOI: 10.32614/CRAN.package.chatAI4R
Author: Satoshi Kume [aut, cre]
Maintainer: Satoshi Kume <satoshi.kume.1984 at gmail.com>
BugReports: https://github.com/kumeS/chatAI4R/issues
License: Artistic-2.0
URL: https://kumes.github.io/chatAI4R/, https://github.com/kumeS/chatAI4R
NeedsCompilation: no
CRAN checks: chatAI4R results
Reference manual: chatAI4R.pdf
Package source: chatAI4R_0.2.10.tar.gz
Windows binaries: r-devel: chatAI4R_0.2.10.zip, r-release: chatAI4R_0.2.10.zip, r-oldrel: chatAI4R_0.2.10.zip
macOS binaries: r-release (arm64): chatAI4R_0.2.10.tgz, r-oldrel (arm64): chatAI4R_0.2.10.tgz, r-release (x86_64): chatAI4R_0.2.10.tgz, r-oldrel (x86_64): chatAI4R_0.2.10.tgz
Old sources: chatAI4R archive
Please use the canonical form https://CRAN.R-project.org/package=chatAI4R to link to this page.