
torch

[Badges: Lifecycle: experimental, tests, CRAN status, Discord]

Installation

torch can be installed from CRAN with:

install.packages("torch")

You can also install the development version with:

remotes::install_github("mlverse/torch")

When the package is loaded for the first time, additional software will be downloaded and installed. See the full installation guide for details.
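If you prefer to trigger that download explicitly, for example on a headless server or inside a container build, you can call the install_torch() helper yourself. This is a sketch; the available options vary between torch versions:

# Explicitly download and install the backend libraries that torch
# would otherwise fetch automatically on first load.
torch::install_torch()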

Examples

You can create torch tensors from R objects with the torch_tensor function and convert them back to R objects with as_array.

library(torch)
x <- array(runif(8), dim = c(2, 2, 2))
y <- torch_tensor(x, dtype = torch_float64())
y
#> torch_tensor
#> (1,.,.) = 
#>   0.6192  0.5800
#>   0.2488  0.3681
#> 
#> (2,.,.) = 
#>   0.0042  0.9206
#>   0.4388  0.5664
#> [ CPUDoubleType{2,2,2} ]
identical(x, as_array(y))
#> [1] TRUE

Simple Autograd Example

In the following snippet, we use torch's autograd feature to compute derivatives. Since y = w * x + b, we expect dy/dx = w = 2, dy/dw = x = 1, and dy/db = 1:

x <- torch_tensor(1, requires_grad = TRUE)
w <- torch_tensor(2, requires_grad = TRUE)
b <- torch_tensor(3, requires_grad = TRUE)
y <- w * x + b
y$backward()
x$grad
#> torch_tensor
#>  2
#> [ CPUFloatType{1} ]
w$grad
#> torch_tensor
#>  1
#> [ CPUFloatType{1} ]
b$grad
#> torch_tensor
#>  1
#> [ CPUFloatType{1} ]
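Gradients like these are the building block of optimization. As a minimal sketch of one manual gradient-descent step, assuming the in-place tensor methods $sub_() and $zero_() and the with_no_grad() helper behave as in current torch releases:

lr <- 0.1
# Update the parameters in place; with_no_grad() prevents these
# updates from being tracked by autograd.
with_no_grad({
  w$sub_(lr * w$grad)
  b$sub_(lr * b$grad)
})
# Reset the accumulated gradients before the next backward pass.
w$grad$zero_()
b$grad$zero_()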

Contributing

No matter your current skill level, it's possible to contribute to torch development. See the contributing guide for more information.
