Joji

Reputation: 5625

TypeScript: why doesn't the return type determined by conditional types work like the function overload equivalent?

I have a function that accepts either one string or two booleans. When it receives a string, it returns a string; when it receives two booleans, it returns a boolean. Here is my implementation using function overloads:

function foo(a: string): string
function foo(a: boolean, b: boolean): boolean
function foo(a: string | boolean, b?: boolean): string | boolean {
    return a
}
const a = foo(true, true) // boolean
const ad = foo('sf') // string

However, if I rewrite this using conditional types:

function foo2<
    B extends boolean | undefined,
    A extends (B extends boolean ? boolean : string)
>(a: A, b?: B): B extends boolean ? boolean : string {
    return a
}

const aa = foo2(true, true) // boolean
const aa2 = foo2('sdf') // ❌ string | boolean

The return type of the string version is wrong: it says string | boolean instead of string.

I'm not sure what went wrong here. It seems like B extends boolean ? boolean : string is not getting the right type for B when b is not provided.

Upvotes: 1

Views: 475

Answers (1)

jcalz

Reputation: 328398

The problem you're facing has little to do with conditional types. It's primarily that when a function parameter is optional and you decide not to pass an argument when you call the function, this missing argument does not serve as an inference site from which to infer generic type parameters. For example:

function bar<T extends true | undefined>(x?: T): void { }
bar(true) // bar<true>()
bar(undefined) // bar<undefined>()
bar(); // bar<true | undefined>();

Here x is an optional parameter of type T constrained to the union true | undefined. If you pass in an explicit value of type true, the compiler will infer true for T. If you pass in an explicit value of type undefined, the compiler will infer undefined for T. But when you don't pass in any value whatsoever, the compiler does not infer T from this at all. The implicit undefined does not participate in type inference. And so type inference of T fails, and the compiler falls back to its constraint, true | undefined.
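If you just need a particular result in one place, you can sidestep inference entirely and write the type argument manually (a minimal sketch reusing bar() from above):

bar<undefined>() // T is undefined, as written
bar<true>()      // T is true, as written

That works, but it pushes the burden onto every caller, so a more general fix is better.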


One way to fix this is to specify a type parameter default for T. When inference fails, the compiler will fall back to the default instead of the constraint:

function bar<T extends true | undefined = undefined>(x?: T): void { }
bar(true) // bar<true>()
bar(undefined) // bar<undefined>()
bar(); // bar<undefined>()

Since T defaults to undefined, the situation for bar() changes so that T is undefined, while the other cases are unchanged.

For your example code, this would look like:

function foo<
    A extends (B extends boolean ? boolean : string),
    B extends boolean | undefined = undefined
>(a: A, b?: B): B extends boolean ? boolean : string {
    return a
}

const aa = foo(true, true) // boolean
// function foo<true, true>(a: true, b?: true | undefined): boolean
const aa2 = foo('sdf') // string
// function foo<"sdf", undefined>(a: "sdf", b?: undefined): string

as desired.
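As a quick check, assuming the same foo() as above, mismatched calls are still rejected, just as with the overload version:

foo(true) // ❌ error: B defaults to undefined, so A is constrained to string
foo('sdf', true) // ❌ error: B is inferred as true, so A is constrained to boolean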


Another approach is to try to simulate overloads more closely by having your function take a rest parameter whose type is a union of tuple types, like this:

function foo<T extends [a: boolean, b: boolean] | [a: string]>(...args: T): T[0] {
    return args[0];
}

const aa = foo(true, true) // true
const aa2 = foo('sdf') // "sdf"
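Note that the return type here is T[0], so callers see the literal type of the first argument (true, "sdf") rather than the widened boolean or string. Mismatched argument lists are still rejected, much like the overloads (a quick sketch with the same foo()):

foo(true) // ❌ error: [true] matches neither [a: boolean, b: boolean] nor [a: string]
foo('sdf', true) // ❌ error: ['sdf', true] matches neither tuple type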

Playground link to code

Upvotes: 2
