exteral

Reputation: 1061

How to find the C++ source code of torch.bmm in PyTorch

I am having trouble finding the source code of torch.bmm(), which is documented at https://pytorch.org/cppdocs/api/function_namespaceat_1aac51f71f807ca70fd210814114520c34.html#exhale-function-namespaceat-1aac51f71f807ca70fd210814114520c34.

I am fairly confident it lives in the at namespace, since it is referenced as at::bmm elsewhere. What I have searched through so far:

  1. The ATen directory: https://github.com/pytorch/pytorch/tree/34877448216149024f44cbcab830169fdb2fa7fb/aten/src/ATen
  2. The caffe2 core directory: https://github.com/pytorch/pytorch/tree/74b65c32be68b15dc7c9e8bb62459efbfbde33d8/caffe2/core
  3. A direct GitHub search for the keyword bmm in C++ files

but I have found nothing. Is there any systematic way to locate a function (in this case, bmm) in such a large project?

Upvotes: 3

Views: 1888

Answers (1)

dxiv

Reputation: 17658

There is no (single) source for bmm per se. From ATen's Readme:

ATen "native" functions are the modern mechanism for adding operators and functions to ATen (they are "native" in contrast to legacy functions, which are bound via TH/THC cwrap metadata). Native functions are declared in native_functions.yaml and have implementations defined in one of the cpp files in this directory.

bmm is declared in aten/src/ATen/native/native_functions.yaml:

- func: bmm(Tensor self, Tensor mat2) -> Tensor
  use_c10_dispatcher: full
  variants: function, method
  dispatch:
    CPU: bmm_cpu
    CUDA: bmm_cuda
    SparseCPU: bmm_sparse_cpu
    SparseCUDA: bmm_sparse_cuda
  supports_named_tensor: True

The implementations (like bmm_cpu) can be found in aten/src/ATen/native/LinearAlgebra.cpp.
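
To see the dispatch in action, here is a minimal sketch that calls at::bmm from the ATen C++ API (it assumes you have libtorch installed and link against it); for CPU tensors the call is routed, via the dispatch table generated from native_functions.yaml, to the bmm_cpu kernel:

    #include <ATen/ATen.h>
    #include <iostream>

    int main() {
      // Batched matrix multiply: (batch, n, m) x (batch, m, p) -> (batch, n, p)
      at::Tensor a = at::rand({10, 3, 4});
      at::Tensor b = at::rand({10, 4, 5});

      // For CPU tensors this dispatches to bmm_cpu (see the yaml entry above);
      // for CUDA tensors it would dispatch to bmm_cuda instead.
      at::Tensor c = at::bmm(a, b);

      std::cout << c.sizes() << std::endl;  // prints [10, 3, 5]
      return 0;
    }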

Upvotes: 3
