MR_MPI-BGC

Reputation: 285

Defining a Torch Class in R package "torch"

This post is related to my earlier question How to define a Python Class which uses R code, but called from rTorch?.

I came across the torch package in R (https://torch.mlverse.org/docs/index.html), which allows you to define a dataset class. However, I also need to be able to define a model class, like class MyModelClass(torch.nn.Module) in Python. Is this possible in the torch package in R?

When I tried to do this with reticulate, it did not work; there were conflicts like

  ImportError: /User/homes/mreichstein/miniconda3/envs/r-torch/lib/python3.6/site-packages/torch/lib/libtorch_python.so: undefined symbol: _ZTINSt6thread6_StateE

It also would not make much sense, since torch isn't wrapping Python.

But this loses a lot of the flexibility that rTorch has (see my problem in the post linked above, though). Thanks for any help! Markus

Upvotes: 4

Views: 537

Answers (1)

Szymon Maszke

Reputation: 24726

You can do that directly using R's torch package, which seems quite comprehensive, at least for basic tasks.

Neural networks

Here is how to create an nn_sequential model (the equivalent of Python's torch.nn.Sequential):

library(torch)

model <- nn_sequential(
    nn_linear(D_in, H),
    nn_relu(),
    nn_linear(H, D_out)
)
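Note that D_in, H, and D_out must be defined before running this; a minimal, self-contained sketch with hypothetical sizes could look like:

```r
library(torch)

# hypothetical dimensions: 10 input features, 32 hidden units, 1 output
D_in <- 10; H <- 32; D_out <- 1

model <- nn_sequential(
  nn_linear(D_in, H),
  nn_relu(),
  nn_linear(H, D_out)
)

# forward pass on a random batch of 4 samples
x <- torch_randn(4, D_in)
y <- model(x)   # tensor of shape (4, 1)
```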

Below is a custom nn_module (the equivalent of torch.nn.Module) implementing a simple dense layer, like torch.nn.Linear (source):

library(torch)

# a custom module implementing a dense layer: y = x %*% w + b
dense <- nn_module(
  classname = "dense",
  # the initialize function runs whenever we instantiate the model
  initialize = function(in_features, out_features) {
    
    # just for you to see when this function is called
    cat("Calling initialize!") 
    
    # we use nn_parameter to indicate that those tensors are special
    # and should be treated as parameters by `nn_module`.
    self$w <- nn_parameter(torch_randn(in_features, out_features))
    self$b <- nn_parameter(torch_zeros(out_features))
    
  },
  # this function is called whenever we call our model on input.
  forward = function(x) {
    cat("Calling forward!")
    torch_mm(x, self$w) + self$b
  }
)

model <- dense(3, 1)
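Once instantiated, the module can be called like a function. A usage sketch (with the cat() calls stripped for brevity, and arbitrary dimensions):

```r
library(torch)

# same custom dense module as above, condensed
dense <- nn_module(
  classname = "dense",
  initialize = function(in_features, out_features) {
    self$w <- nn_parameter(torch_randn(in_features, out_features))
    self$b <- nn_parameter(torch_zeros(out_features))
  },
  forward = function(x) torch_mm(x, self$w) + self$b
)

model <- dense(3, 1)
x <- torch_randn(2, 3)   # batch of 2 samples with 3 features each
y <- model(x)            # tensor of shape (2, 1)
```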

Another example, this time using nn_linear layers to build a small neural network (source):

two_layer_net <- nn_module(
   "two_layer_net",
   initialize = function(D_in, H, D_out) {
      self$linear1 <- nn_linear(D_in, H)
      self$linear2 <- nn_linear(H, D_out)
   },
   forward = function(x) {
      x %>% 
         self$linear1() %>% 
         nnf_relu() %>% 
         self$linear2()
   }
)
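A minimal training sketch for this module, assuming random data and hypothetical sizes, using optim_sgd and nnf_mse_loss (nested calls instead of the pipe, so it runs with only torch loaded):

```r
library(torch)

two_layer_net <- nn_module(
  "two_layer_net",
  initialize = function(D_in, H, D_out) {
    self$linear1 <- nn_linear(D_in, H)
    self$linear2 <- nn_linear(H, D_out)
  },
  forward = function(x) {
    self$linear2(nnf_relu(self$linear1(x)))
  }
)

# hypothetical sizes and random data, just to show the training pattern
D_in <- 8; H <- 16; D_out <- 1
model <- two_layer_net(D_in, H, D_out)
x <- torch_randn(64, D_in)
y <- torch_randn(64, D_out)

optimizer <- optim_sgd(model$parameters, lr = 1e-3)

for (epoch in 1:5) {
  optimizer$zero_grad()
  loss <- nnf_mse_loss(model(x), y)
  loss$backward()
  optimizer$step()
}
```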

There are also other examples, like this one (using control flow and weight sharing).

Other

Looking at the reference, it seems most layers are already provided (I didn't notice transformer layers at a quick glance, but that is minor).

As far as I can tell, the basic blocks for neural networks, their training, etc. are in place (even JIT, so sharing models between languages should be possible).

Upvotes: 3

Related Questions