pseudomarvin

Reputation: 1627

How can I have a template method without exposing its implementation?

I have a function template in TFRuntime.h:

class TFRuntime {
...
    template <typename T>
    Status computeXYSlice(Volume<T>* input, int zCoord, Volume<T> *output);
...
};

TFRuntime.cpp includes TensorFlow library headers such as

#include <tensorflow/cc/ops/standard_ops.h>
#include <tensorflow/cc/saved_model/loader.h>

I do not want these includes in the header, since that would force anyone using TFRuntime to include them as well. However, if I want the computeXYSlice function to allow any type, I have to put the implementation in the .h file, and that implementation requires the above-mentioned TensorFlow headers.

How do I get around this problem? Could I explicitly 'instantiate' only certain variants of the computeXYSlice function? E.g., where T is float or int or double? Or is there a better way?

Upvotes: 6

Views: 1502

Answers (3)

Could I explicitly 'instantiate' only certain variants of the computeXYSlice function? E.g., where T is float or int or double?

You may, and their implementation need not be in the header. I'll get to that in a moment. But if you really want to allow any type then your template must be in a header. That's just how it is.

If you wish to support only a small set of types, as template instantiations, without overloading (can make a difference sometimes when doing lookup), the standard has a mechanism for explicit template instantiation.

Your header will look something like this...

class TFRuntime {
public:
    template <typename T>
    Status computeXYSlice(Volume<T>* input, int zCoord, Volume<T> *output);
};

... And your implementation file will contain the explicit instantiation definitions, like so...

template <typename T>
Status TFRuntime::computeXYSlice(Volume<T>* input, int zCoord, Volume<T> *output) {
  // Implement it
}

template
Status TFRuntime::computeXYSlice(Volume<int>*, int, Volume<int>*);

template
Status TFRuntime::computeXYSlice(Volume<double>*, int, Volume<double>*);

You have to include the explicit instantiation definitions; otherwise your program is ill-formed, no diagnostic required: the template function must be defined at the point of implicit instantiation, unless an explicit instantiation appears somewhere in the program.

This is a bit cumbersome. But if your end goal is to indeed have a bunch of instantiations (as opposed to overloads), that's how you tie it all together.

Upvotes: 5

Aconcagua

Reputation: 25526

I personally would just stay with the public template. Sure, you'll then be forced to ship the included headers together with your own, but I'd bite the bullet in this case to leave users of your class as much flexibility as possible.

You might at least hide that away in a separate ".inl" or ".impl" file (it doesn't solve the problem in the sense you asked, but makes it less visible):

class TFRuntime
{
public:
    template <typename T>
    Status computeXYSlice(Volume<T>* input, int zCoord, Volume<T>* output);
};

#include "TFRuntime.inl"

and:

#include <tensorflow/cc/ops/standard_ops.h>
#include <tensorflow/cc/saved_model/loader.h>

template <typename T>
Status TFRuntime::computeXYSlice(Volume<T>* input, int zCoord, Volume<T>* output)
{
    // ...
}

If you really want to limit the range of data types that can be used, I'd do it via overloads:

class TFRuntime
{
public:
    Status computeXYSlice(Volume<int>* input, int zCoord, Volume<int>* output);
    Status computeXYSlice(Volume<double>* input, int zCoord, Volume<double>* output);
    Status computeXYSlice(Volume<unsigned long>* input, int zCoord, Volume<unsigned long>* output);
private:
    template <typename T>
    Status computeXYSlice(Volume<T>* input, int zCoord, Volume<T>* output);
};

As the template function is now private, it cannot be called from "outside", and you can safely implement it in the .cpp file (where only the required specialisations will be instantiated), together with the normal overloaded functions calling the template function. Note that you need to provide the template parameter explicitly to prevent recursion – or give the template function a different name. Do not implement the overloads inside the class definition, otherwise they get inlined, which exposes the template function to other classes again and requires its implementation to be available again...

Upvotes: 2

Daniel Langr

Reputation: 23497

You can wrap the used (non-templated) TensorFlow functionality in your own header/source files and call your wrappers from your templated code:

// wrapper.h:
void some_function();

// wrapper.cpp:
#include <tensorflow/...>
void some_function() { /* use tensorflow stuff here */ }

// TFRuntime.h:
#include "wrapper.h" // no inclusion of Tensorflow headers involved

template <typename T>
void some_templated_function() { 
    some_function();
}

Live demo: https://wandbox.org/permlink/dWRT0AEi8alylTQB

However, this solution adds some code redundancy and could break if the TensorFlow API changes.

Upvotes: 3
