Reputation: 13
Given the struct Foo containing a collection of elements:
#[derive(Debug)]
struct Foo {
    bar: Vec<i8>,
}
I have written a mutable view object intended to encapsulate a part of Foo:
#[derive(Debug)]
struct View<'a> {
    foo: &'a mut Foo,
}

impl<'a> View<'a> {
    fn iter(&'a self) -> std::slice::Iter<'a, i8> {
        self.foo.bar.iter()
    }
    fn iter_mut(&'a mut self) -> std::slice::IterMut<'a, i8> {
        self.foo.bar.iter_mut()
    }
    fn mutate(&'a mut self) {
        let mut vector: Vec<i8> = vec![];
        for value in self.iter().take(1).cloned() {
            vector.push(value);
        }
        for value in self.iter_mut() {
            *value = 0;
        }
    }
}
The View struct above works as intended, and the following code prints Foo { bar: [0, 0, 0] }.
fn main() {
    let mut foo = Foo { bar: vec![0, 1, 2] };
    let mut view = View { foo: &mut foo };
    view.mutate();
    println!("{:?}", foo);
}
However, different kinds of views should be possible: if Foo were a matrix, views could be rows, columns, or even submatrices. I have thus rewritten the View as a trait implemented by a struct, and given mutate a default implementation:
trait AbstractView<'a> {
    type Iterator: Iterator<Item = &'a i8>;
    type IteratorMut: Iterator<Item = &'a mut i8>;
    fn iter(&'a self) -> Self::Iterator;
    fn iter_mut(&'a mut self) -> Self::IteratorMut;
    fn mutate(&'a mut self) {
        let mut vector: Vec<i8> = vec![];
        for value in self.iter().take(1).cloned() {
            vector.push(value);
        }
        for value in self.iter_mut() {
            *value = vector[0];
        }
    }
}

#[derive(Debug)]
struct View<'a> {
    foo: &'a mut Foo,
}

impl<'a> AbstractView<'a> for View<'a> {
    type Iterator = std::slice::Iter<'a, i8>;
    type IteratorMut = std::slice::IterMut<'a, i8>;
    fn iter(&'a self) -> Self::Iterator {
        self.foo.bar.iter()
    }
    fn iter_mut(&'a mut self) -> Self::IteratorMut {
        self.foo.bar.iter_mut()
    }
}
This code does not compile successfully; rustc complains about the call to iter_mut in mutate:
error[E0502]: cannot borrow `*self` as mutable because it is also borrowed as immutable
--> src/main.rs:18:22
|
6 | trait AbstractView<'a> {
| -- lifetime `'a` defined here
...
15 | for value in self.iter().take(1).cloned() {
| -----------
| |
| immutable borrow occurs here
| argument requires that `*self` is borrowed for `'a`
...
18 | for value in self.iter_mut() {
| ^^^^^^^^^^^^^^^ mutable borrow occurs here
Why does implementing mutate as a default method on the trait cause what looks like different behaviour from the borrow checker? How can I get this trait to work?
Using rustc version 1.43.1.
Upvotes: 1
Views: 121
Reputation: 13
Thanks to SCappella's answer, I have understood and fixed my issue. Since I'd rather have a clean code base than an efficient one for my use case, I have replaced the iterators with vectors:
trait AbstractView {
    fn refs(&self) -> Vec<&i8>;
    fn refs_mut(&mut self) -> Vec<&mut i8>;
    fn mutate(&mut self) {
        let mut vector: Vec<i8> = vec![];
        for value in self.refs().iter().take(1) {
            vector.push(**value);
        }
        for value in self.refs_mut() {
            *value = vector[0];
        }
    }
}

impl AbstractView for View<'_> {
    fn refs(&self) -> Vec<&i8> {
        self.foo.bar.iter().collect()
    }
    fn refs_mut(&mut self) -> Vec<&mut i8> {
        self.foo.bar.iter_mut().collect()
    }
}
This lets me avoid duplicating the mutate method.
Upvotes: 0
Reputation: 10424
It's easy to tell why the trait-based version doesn't work, but harder to say why the original does work.
It's all in the lifetimes. For the trait-based version, there's only a single lifetime 'a everywhere. When we call self.iter() or self.iter_mut(), the borrow lasts for that same lifetime. That means we can't call both: if we did, the immutable and mutable borrows would have the same lifetime, so they would exist simultaneously.
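The conflict isn't specific to default method bodies, either; the same thing happens in any generic context where 'a is a fixed parameter. Here's a minimal sketch (zero_first is a hypothetical free function, assuming the trait's iter/iter_mut declarations from the question with the default mutate body removed):

// Hypothetical free function, generic over AbstractView (assume the trait from the
// question with only the iter/iter_mut declarations, i.e. no default mutate body).
// Because 'a is a fixed lifetime parameter here, each call borrows *view for all of 'a.
fn zero_first<'a, V: AbstractView<'a>>(view: &'a mut V) {
    let mut vector: Vec<i8> = vec![];
    for value in view.iter().take(1).cloned() {
        vector.push(value);
    }
    // Uncommenting this loop reproduces the same error[E0502] as the default method:
    // for value in view.iter_mut() {
    //     *value = 0;
    // }
}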
This raises the question of why the non-trait version works. Doesn't it do the exact same thing? The answer lies in the variance of the types std::slice::Iter<'a, T> and std::slice::IterMut<'a, T>. The variance of a generic type T<'a> describes whether and how T<'a> can be coerced to T<'b> when 'a and 'b are related.
For many types, this relationship is covariant: if 'a is longer than 'b (written 'a: 'b), then values of type T<'a> can be coerced to values of type T<'b>. For some other types, the relationship is contravariant: if 'a: 'b, then T<'b> can be coerced to T<'a> (an example of this is Fn(&'a T)). Finally, some types are invariant, so no coercion can occur.
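To make the three cases concrete, here is a minimal sketch using plain references and fn pointers (the helper names are made up): shared references are covariant in their lifetime, fn-pointer arguments are contravariant, and &mut is invariant in the type it points to.

// Covariant: a longer-lived shared reference coerces to a shorter-lived one.
fn shorten<'short, 'long: 'short>(r: &'long i8) -> &'short i8 {
    r
}

// Contravariant: a function accepting short-lived references can be used
// where a function accepting longer-lived references is expected.
fn widen<'short, 'long: 'short>(f: fn(&'short i8)) -> fn(&'long i8) {
    f
}

// Invariant: the lifetime behind a mutable reference cannot be adjusted.
// This does NOT compile:
// fn rebind<'short, 'long: 'short>(r: &'short mut &'long i8) -> &'short mut &'short i8 {
//     r
// }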
std::slice::Iter<'a, T> is covariant in the lifetime 'a. If 'a is longer than 'b, we can coerce to the shorter lifetime. That's exactly what's happening in your code. When we call self.iter().take(1).cloned(), self.iter() is actually coerced to a shorter std::slice::Iter<'b, i8> so that the mutable borrow can happen later.
fn mutate(&'a mut self) {
    let mut vector: Vec<i8> = vec![];
    // let iter = self.iter(); // works
    let mut iter: std::slice::Iter<'a, i8> = self.iter(); // doesn't work!
    for value in iter.take(1).cloned() {
        vector.push(value);
    }
    for value in self.iter_mut() {
        *value = vector[0];
    }
}
Using the code above, we get an error similar to your trait-based code.
error[E0502]: cannot borrow `*self` as mutable because it is also borrowed as immutable
--> src/main.rs:27:22
|
11 | impl<'a> View<'a> {
| -- lifetime `'a` defined here
...
23 | let iter: std::slice::Iter<'a, i8> = self.iter(); // doesn't work!
| ------------------------ ---- immutable borrow occurs here
| |
| type annotation requires that `*self` is borrowed for `'a`
...
27 | for value in self.iter_mut() {
| ^^^^^^^^^^^^^^^ mutable borrow occurs here
Incidentally, the mutable side doesn't get the same flexibility. Mutable references are invariant in the type they point to (they have to be, in order to be sound), so the &mut *self reborrow that iter_mut needs is pinned to the full lifetime 'a and can't be shortened the same way. This means that if you swapped the order of the mutable and immutable borrows, you'd get an error even in the non-trait version.
fn mutate(&'a mut self) {
    let mut vector: Vec<i8> = vec![];
    for value in self.iter_mut() {
        // This would panic if it compiled, of course
        *value = vector[0];
    }
    for value in self.iter().take(1).cloned() {
        vector.push(value);
    }
}
So the trait-based version doesn't work because self.iter() requires the borrow to last too long, and it can't be coerced to a shorter borrow. In fact, with how things are written, it might not even make sense to have a shorter borrow: Self::Iterator might only be defined for that one particular lifetime.
So what's the ideal way to write this? One way is to put the implementation of mutate in each implementation of AbstractView. When using the concrete types std::slice::Iter and std::slice::IterMut, the compiler knows that we can use covariance to make the lifetime shorter.
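For completeness, here is a sketch of that first option, assuming mutate becomes a required method and its body moves from the trait into the impl (Foo and View as in the question):

trait AbstractView<'a> {
    type Iterator: Iterator<Item = &'a i8>;
    type IteratorMut: Iterator<Item = &'a mut i8>;

    fn iter(&'a self) -> Self::Iterator;
    fn iter_mut(&'a mut self) -> Self::IteratorMut;

    // No default body: each implementation provides its own.
    fn mutate(&'a mut self);
}

impl<'a> AbstractView<'a> for View<'a> {
    type Iterator = std::slice::Iter<'a, i8>;
    type IteratorMut = std::slice::IterMut<'a, i8>;

    fn iter(&'a self) -> Self::Iterator {
        self.foo.bar.iter()
    }

    fn iter_mut(&'a mut self) -> Self::IteratorMut {
        self.foo.bar.iter_mut()
    }

    // Same body as the default implementation, but here the compiler sees the
    // concrete std::slice::Iter and can shorten the immutable borrow.
    fn mutate(&'a mut self) {
        let mut vector: Vec<i8> = vec![];
        for value in self.iter().take(1).cloned() {
            vector.push(value);
        }
        for value in self.iter_mut() {
            *value = vector[0];
        }
    }
}

The main from the question should work unchanged with this version.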
A more principled solution would be to make Self::Iterator and Self::IteratorMut generic in their lifetimes so that the borrows can be shortened as needed. Generic associated types like this aren't available on stable Rust yet. On the nightly compiler, it's possible to do this, though as the compiler rightly warns, generic associated types are not yet finished and may cause compiler crashes or bugs.
#![feature(generic_associated_types)]

#[derive(Debug)]
struct Foo {
    bar: Vec<i8>,
}

trait AbstractView {
    type Iterator<'b>: Iterator<Item = &'b i8>;
    type IteratorMut<'b>: Iterator<Item = &'b mut i8>;

    // Eventually, these lifetimes should be elided
    // But it doesn't seem that that's implemented yet
    fn iter<'a>(&'a self) -> Self::Iterator<'a>;
    fn iter_mut<'a>(&'a mut self) -> Self::IteratorMut<'a>;

    fn mutate(&mut self) {
        let mut vector: Vec<i8> = vec![];
        for value in self.iter().take(1).cloned() {
            vector.push(value);
        }
        for value in self.iter_mut() {
            *value = vector[0];
        }
    }
}

#[derive(Debug)]
struct View<'a> {
    foo: &'a mut Foo,
}

impl<'a> AbstractView for View<'a> {
    type Iterator<'b> = std::slice::Iter<'b, i8>;
    type IteratorMut<'b> = std::slice::IterMut<'b, i8>;

    fn iter<'b>(&'b self) -> Self::Iterator<'b> {
        self.foo.bar.iter()
    }

    fn iter_mut<'b>(&'b mut self) -> Self::IteratorMut<'b> {
        self.foo.bar.iter_mut()
    }
}

fn main() {
    let mut foo = Foo { bar: vec![0, 1, 2] };
    let mut view = View { foo: &mut foo };
    view.mutate();
    println!("{:?}", foo);
}
Upvotes: 1