Decouple custom ops in llama_transformer.py Part 1/N #12709

Annotations

1 warning

test-models-linux (cmake, llava_encoder, portable, linux.4xlarge, 90) / linux-job

Succeeded Apr 12, 2024 in 27m 46s