GSi

Reputation: 649

template deduction and explicitly supplied types for parameter packs

I will simplify and shorten this question to facilitate an answer.

The essence is:

why does this code compile and execute

  #include <iostream>

  template <class A> class Goofy {};

  template <int N, template <class> class B, class A, int ... K, class ... Z>
  void f ( A a, B<A> b, Z ... z )
  {
      std::cout << "I'm executed" << std::endl;
  }

  int main()
  {
      Goofy<int> goofy;
      f<1, Goofy, int, 2, 3, 4>(2, goofy, 1.3, 'a', 1.f);
  }

while the following doesn't?

  #include <iostream>

  template <class A> class Goofy {};

  template <int N, template <class> class B, class A, int ... K, class ... Z>
  void f ( A a, B<A> b, Z ... z )
  {
      std::cout << "I'm executed" << std::endl;
  }

  int main()
  {
      Goofy<int> goofy;
      f<1, Goofy, int, 2, 3, 4, double, char, float>(2, goofy, 1.3, 'a', 1.f);
  }

The only difference is that the second call explicitly supplies the types that instantiate the pack Z, consistently with the function arguments.

I didn't expect this to be a compilation error, especially with a diagnostic saying that template deduction/substitution failed when, IMHO, no deduction should be needed at all.

Can anybody explain this to me?

I tried GCC 7.3.1 and Clang 4.0.1, and both behave the same, so I fear something is deeply wrong in my reasoning... but I cannot find what.

Upvotes: 1

Views: 39

Answers (1)

bolov

Reputation: 75727

Your code can be reduced to:

template <int... N, class T>
auto foo(T) {}

auto test()
{
    foo<1, 2, 3>(4);      // OK
    foo<1, 2, 3, int>(4); // ERROR
}

The reason is that template parameter packs are greedy when matching explicitly supplied template arguments. That's why a pack whose arguments you want to state explicitly must come last.

When you write foo<1, 2, 3>(4);:

  • 1, 2, 3 are matched against int... N -> N deduced as 1, 2, 3 -> OK
  • then T is deduced from the function argument, i.e. 4, as int -> OK

When you write foo<1, 2, 3, int>(4);:

  • 1, 2, 3, int are all matched against int... N, because the pack greedily consumes every explicitly supplied argument
  • int is a type and cannot match the non-type parameter int -> error; the explicit list never reaches Z

Upvotes: 3
