(T | undefined) & T has different behavior when T is known versus when T is a type variable #46976

Closed
TheUnlocked opened this issue Dec 1, 2021 · 1 comment · Fixed by #47010
Labels: Bug (A bug in TypeScript) · Fix Available (A PR has been opened for this issue)

TheUnlocked commented Dec 1, 2021

Bug Report

🔎 Search Terms

  • or undefined
  • and undefined
  • undefined generic disjunction
  • generic or undefined

🕗 Version & Regression Information

  • This is the behavior in every version I tried

⏯ Playground Link

Playground link with relevant code

💻 Code

// Samples of behavior on known types
// The evaluated type is always equal to the input type
type Test1 = (1 | undefined) & 1; // 1
type Test2 = (string | undefined) & string; // string
type Test3 = ({ a: 1 } | undefined) & { a: 1 }; // { a: 1 } & { a: 1 }
type Test4 = (unknown | undefined) & unknown; // unknown
type Test5 = (never | undefined) & never; // never

// Minimal example of issue
function f<T>() {
    type Y = (T | undefined) & T; // (T | undefined & T)
}

// Minimal example of when this could cause a problem
function g1<T extends {}, A extends { z: (T | undefined) & T }>(a: A) {
    const { z } = a;

    return {
        // T must extend {} so it should be spreadable
        // ERROR: Spread types may only be created from object types. (2698)
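        // (a workaround sketch follows this code block)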
        ...z
    };
}

// If the type of z is just T instead of (T | undefined) & T, it works fine
function g2<T extends {}, A extends { z: T }>(a: A) {
    const { z } = a;

    return {
        ...z
    };
}
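
One possible workaround on compiler versions without the fix (an illustrative sketch, not part of the original report) is to assert z back to T before spreading; the assertion should be accepted because T | (T & undefined) is assignable to T, and a double assertion through unknown works if the direct one is rejected.

// Hypothetical workaround, not part of the original report
function g3<T extends {}, A extends { z: (T | undefined) & T }>(a: A) {
    const { z } = a;

    return {
        // Asserting to T restores spreadability, since T extends {}
        ...(z as T)
    };
}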

πŸ™ Actual behavior

When T is a type variable, the type expression (T | undefined) & T does not reduce to T; it stays in its written form (displayed as T | undefined & T).

🙂 Expected behavior

Either (T | undefined) & T should evaluate to T when T is a type variable, since that's the behavior with known types, or it should not evaluate to T when T is a known type and should instead evaluate to something else in both cases (never, maybe?).

@ahejlsberg (Member) commented

The type (T | undefined) & T normalizes to T | T & undefined which, upon subtype reduction, further reduces to just T. However, because it is expensive, we don't always perform subtype reduction. Likewise, in cases where you manually write it out, we don't go through the extra effort of reducing. So, it is possible and expected to see types like T | T & undefined.
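
Spelled out step by step (an illustrative sketch; the displayed types match the comments in the report above):

// Intersection distributes over union:
//   (T | undefined) & T
//     => (T & T) | (undefined & T)    distributing & over |
//     => T | (undefined & T)          since T & T is just T
// Subtype reduction would drop undefined & T (a subtype of T) and leave
// just T, but the checker doesn't always pay that cost.
function h<T>() {
    type Normalized = (T | undefined) & T; // displayed as (T | undefined & T)
}

// With a concrete type, undefined & string collapses and the union reduces:
type Reduced = (string | undefined) & string; // string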

The real issue here is that we don't consider T | T & undefined to be a spreadable type. Given that T is spreadable and that it is perfectly valid to spread an undefined (which does nothing), it should definitely also be ok to spread a T & undefined. That's what needs to be fixed.
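
For reference, spreading undefined is already a no-op, both at runtime under ES2018 object-spread semantics and in the checker, which is what makes the argument above work:

// Spreading undefined contributes no properties
const base = { a: 1 };
const merged = { ...base, ...undefined }; // type: { a: number }, value: { a: 1 }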

@ahejlsberg ahejlsberg added the Bug A bug in TypeScript label Dec 2, 2021
@ahejlsberg ahejlsberg self-assigned this Dec 2, 2021
@ahejlsberg ahejlsberg added this to the TypeScript 4.6.0 milestone Dec 2, 2021
@typescript-bot typescript-bot added the Fix Available A PR has been opened for this issue label Dec 3, 2021