
[Relay][Op] Trilu operator implementation #12124

Merged 7 commits on Aug 2, 2022

Commits on Aug 2, 2022

  1. Added topi trilu implementation

    jwfromm authored and Josh Fromm committed Aug 2, 2022
    89f0e52
  2. Implemented and tested full Trilu op.

    jwfromm authored and Josh Fromm committed Aug 2, 2022
    3f78abf
  3. Fix test type.

    jwfromm authored and Josh Fromm committed Aug 2, 2022
    5f7f744
  4. Add tril zero tests.

    Josh Fromm committed Aug 2, 2022
    0721e22
  5. Add pytorch trilu integration.

    Josh Fromm committed Aug 2, 2022
    aa46851
  6. Clean up torch integration.

    Josh Fromm committed Aug 2, 2022
    0585271
  7. Readded skip for zero tests.

    Josh Fromm committed Aug 2, 2022
    47366cf
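
For context, here is a minimal sketch of the Trilu semantics the commits above implement, using NumPy as a reference stand-in. The exact Relay/TOPI function names added by this PR are not reproduced here; the helper below is hypothetical and only illustrates the expected behavior (mirroring torch.tril / torch.triu, which the PyTorch integration commit maps onto the new op).

    import numpy as np

    # Hypothetical reference for a Trilu-style op: keep the upper (triu) or
    # lower (tril) triangular part of a matrix relative to diagonal offset k,
    # zeroing everything else.
    def trilu_reference(data: np.ndarray, k: int = 0, upper: bool = True) -> np.ndarray:
        return np.triu(data, k) if upper else np.tril(data, k)

    x = np.arange(1, 10).reshape(3, 3)
    print(trilu_reference(x, k=0, upper=True))    # upper triangle kept
    print(trilu_reference(x, k=-1, upper=False))  # strictly-lower part kept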