Reputation: 1039
#include <iostream>
#include <string>
using namespace std;
void func(string &&a) { cout << "#1" << endl; }
void func(const string &&a) { cout << "#2" << endl; }
void func(int &&a) { cout << "#3" << endl; }
void func(const int &&a) { cout << "#4" << endl; }
int main()
{
func(string("1")); // call func(string &&)
func((const string)string("1")); // call func(const string &&)
func(1); // call func(int &&)
func((const int)1); // call func(int &&) not func(const int &&)
return 0;
}
From the C++ standard:
Standard conversion sequence S1 is a better conversion sequence than standard conversion sequence S2 if
...
S1 and S2 are reference bindings (8.5.3), and the types to which the references refer are the same type except for top-level cv-qualifiers, and the type to which the reference initialized by S2 refers is more cv-qualified than the type to which the reference initialized by S1 refers.
It seems that the last call doesn't behave as I expected. Can someone explain this to me?
Upvotes: 2
Views: 166
Reputation: 15823
The type of (const int)1 is adjusted to int before overload resolution takes place.
If a prvalue initially has the type "cv T", where T is a cv-unqualified non-class, non-array type, the type of the expression is adjusted to T prior to any further analysis.
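You can observe this adjustment directly by inspecting the expressions' types with decltype. Here is a minimal sketch (the static_assert messages are my own wording): the cast to const int yields a plain int prvalue, while the cast to const string keeps its const qualification, which is why only the class-type call picks the const && overload.

#include <string>
#include <type_traits>

// const is dropped for the non-class prvalue...
static_assert(std::is_same<decltype((const int)1), int>::value,
              "(const int)1 has type int");

// ...but kept for the class-type prvalue, so func(const string &&) wins there.
static_assert(std::is_same<decltype((const std::string)std::string("1")),
                           const std::string>::value,
              "(const string)string(\"1\") has type const string");

int main() { return 0; }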
Upvotes: 1