passing_through

Reputation: 1941

Why can't the lifetime of a reference be deduced as the shortest of all the contextual possibilities?

Here's a code sample where I test two &strs and return one of them:

// does not compile: error[E0106]: missing lifetime specifier
fn bad_longest(s1: &str, s2: &str) -> &str {
  if s1.len() >= s2.len() { s1 } else { s2 }
}

It didn't compile, with the compiler asking for explicit lifetimes, so I provided them:

fn longest<'r, 'a, 'b>(s1: &'a str, s2: &'b str) -> &'r str
where
  'a: 'r,
  'b: 'r
{
  if s1.len() >= s2.len() { s1 } else { s2 }
}

Now, the following test passes with no problems:

static STATIC: &str = "123";

fn main() {
  let auto = "123456";
  let dyn_ = String::from("123456789");
  println!(
    "{}",
    longest(
      longest(STATIC, auto),
      dyn_.as_str()
    )
  );
}

Here's my question: aren't the lifetimes which I manually provided obviously deducible from the context? Am I missing any use cases?

Upvotes: 0

Views: 107

Answers (1)

Peter Hall

Reputation: 58805

Elided lifetimes in a function signature are never deduced from how the function is used. There are some simple rules for inferring elided lifetimes, based entirely on the signature itself:

  • Each elided lifetime in the parameters becomes a distinct lifetime parameter.
  • If there is exactly one lifetime used in the parameters (elided or not), that lifetime is assigned to all elided output lifetimes.
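As a sketch of how those two rules play out in free functions (the function names here are made up for illustration):

// One elided input lifetime: the second rule applies, so the elided output
// lifetime is taken from `s`. The compiler reads this as
// fn first_word<'a>(s: &'a str) -> &'a str
fn first_word(s: &str) -> &str {
  s.split_whitespace().next().unwrap_or("")
}

// Two elided input lifetimes: the first rule gives each parameter its own
// lifetime, but no rule picks one for the output, so this is rejected with
// "missing lifetime specifier" -- exactly what happened with bad_longest.
// fn pick(a: &str, b: &str) -> &str { a }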

In method signatures there is one additional rule:

  • If the receiver has type &Self or &mut Self, then the lifetime of that reference to Self is assigned to all elided output lifetime parameters.
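A sketch of that rule, using a made-up type just for illustration:

struct Wrapper {
  inner: String,
}

impl Wrapper {
  // The receiver is &self, so the elided output lifetime is taken from the
  // borrow of self, not from `_label`. The compiler reads this as
  // fn value<'s>(&'s self, _label: &str) -> &'s str
  fn value(&self, _label: &str) -> &str {
    &self.inner
  }
}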

Your function has two non-self arguments, so none of these rules can assign a lifetime to the return value; explicit lifetime annotations are therefore required.
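For what it's worth, the usual way to annotate your function is with a single lifetime on both inputs. Your 'a: 'r and 'b: 'r bounds only say that the returned borrow cannot outlive either input, and one shared parameter already expresses that:

fn longest<'a>(s1: &'a str, s2: &'a str) -> &'a str {
  if s1.len() >= s2.len() { s1 } else { s2 }
}

Your main still compiles with this signature, because the longer lifetimes of STATIC and auto can shrink to match the borrow of dyn_.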

Upvotes: 2
