Reputation: 902
I am a bit puzzled by why dereferencing a &&str
doesn't seem to work in the second case:
use std::collections::HashSet;
fn main() {
    let days = vec!["mon", "tue", "wed"];
    let mut hs: HashSet<String> = HashSet::new();
    for d in &days {
        // works
        hs.insert(String::from(*d));
        // doesn't
        hs.insert(*d.to_string());
    }
    println!("{:#?}", hs);
}
str does implement the ToString trait, but it still gives me the error:
error[E0308]: mismatched types
--> src/main.rs:12:19
|
12 | hs.insert(*d.to_string());
| ^^^^^^^^^^^^^^ expected struct `std::string::String`, found str
|
= note: expected type `std::string::String`
found type `str`
What syntax am I getting wrong here?
Upvotes: 2
Views: 87
Reputation: 8466
to_string is called on d before it's deref'd, so you end up dereferencing the String it returns, which results in str. (Unary * binds less tightly than a method call, so *d.to_string() parses as *(d.to_string()).)
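A minimal sketch of that parse (the binding of d below is an assumption, mirroring the loop variable's &&str type):
fn main() {
    // `d` has type `&&str`, like the loop variable in the question.
    let d: &&str = &"mon";
    // Method calls bind tighter than unary `*`,
    // so `*d.to_string()` means `*(d.to_string())`:
    let s: String = d.to_string(); // to_string() runs on `d` first
    // Dereferencing that String yields an unsized `str`, which is why
    // `hs.insert(*d.to_string())` sees `str` where it expects `String`:
    // let bad = *s; // does not compile: `str` is unsized
    println!("{}", s);
}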
Change it to
hs.insert(d.to_string());
This works because d is automatically deref'd to str, which is then converted into a String. This is called deref coercion.
If you have a type U, and it implements Deref<Target=T>, values of &U will automatically coerce to a &T
...
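A minimal sketch of that coercion (the helper take_str is hypothetical):
fn take_str(s: &str) {
    println!("{}", s);
}

fn main() {
    let owned = String::from("mon");
    // String implements Deref<Target = str>,
    // so &String coerces automatically to &str here:
    take_str(&owned);
}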
Deref will also kick in when calling a method.
This is exactly the case here: impl Deref<Target = str> for String
. See here for an example:
A value of type &&&&&&&&&&&&&&&&Foo can still have methods defined on Foo called, because the compiler will insert as many * operations as necessary to get it right. And since it's inserting *s, that uses Deref.
This example demonstrates it:
struct Foo;

impl Foo {
    fn foo(&self) {
        println!("Foo");
    }
}

fn main() {
    let f = &&Foo;
    // prints "Foo": the compiler inserts the needed derefs
    f.foo();
}
By the way,
hs.insert((*d).to_string());
will also work, since d is explicitly deref'd to &str first.
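Putting it together, a minimal sketch of the variants that compile (same setup as the question):
use std::collections::HashSet;

fn main() {
    let days = vec!["mon", "tue", "wed"];
    let mut hs: HashSet<String> = HashSet::new();
    for d in &days {
        hs.insert(String::from(*d)); // explicit deref, then conversion
        hs.insert(d.to_string()); // auto-deref during method lookup
        hs.insert((*d).to_string()); // explicit deref to &str first
    }
    println!("{:#?}", hs);
}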
Upvotes: 4