
Add magpie support llama cpp ollama #1086

Open · davidberenstein1957 wants to merge 11 commits into develop from feat/add-magpie-support-llama-cpp-ollama
Conversation

davidberenstein1957
Member

I added Magpie support for Ollama and llama.cpp.

  • Ollama support
  • llama.cpp support
  • minor refactors w.r.t. the import of the distilabel.models module

I looked into adding support via the OpenAI API format for other providers, but this does not work because tokenization is handled server-side and cannot be disabled out of the box.

Perhaps we can refactor the HF InferenceClient a bit to make this work.
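To make the limitation concrete, here is a toy, pure-Python sketch (no real provider API; the endpoint names and tokens are invented for illustration) of why magpie needs a raw-completion interface: a chat-style endpoint applies its template server-side and always closes the last turn, while a raw completion uses the prompt exactly as sent, leaving it open at the start of the user turn.

```python
def chat_endpoint(messages):
    """Toy server-side templating: wraps every message and always
    closes the last turn, which the client cannot prevent."""
    out = ""
    for m in messages:
        out += f"<|{m['role']}|>\n{m['content']}<|end|>\n"
    return out

def completion_endpoint(raw_prompt):
    """Toy raw completion: the prompt is used exactly as sent."""
    return raw_prompt

# A magpie-style pre-query prompt, cut off at the start of the user turn:
pre_query = "<|system|>\nYou are a helpful assistant.<|end|>\n<|user|>\n"

# The chat endpoint closes the turn itself, breaking the magpie trick:
print(chat_endpoint([{"role": "user", "content": "hi"}]))
# The completion endpoint leaves the prompt open mid-turn, as magpie requires:
print(completion_endpoint(pre_query))
```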

Llamacpp

from distilabel.models import LlamaCppLLM
from distilabel.steps.tasks import Magpie

llm = LlamaCppLLM(
    model_path="smollm2-360m-instruct-q8_0.gguf",
    # tokenizer_id lets the chat template be applied client-side
    tokenizer_id="HuggingFaceTB/SmolLM2-360M-Instruct",
    magpie_pre_query_template="qwen2",
)
magpie = Magpie(llm=llm)
magpie.load()

print(next(magpie.process(inputs=[{"system_prompt": "You are a helpful assistant."}])))

Ollama

from distilabel.models import OllamaLLM
from distilabel.steps.tasks import Magpie

llm = OllamaLLM(
    model="llama3.1",
    tokenizer_id="meta-llama/Meta-Llama-3-8B-Instruct",
    magpie_pre_query_template="llama3",
)
magpie = Magpie(llm=llm)
magpie.load()

print(next(magpie.process(inputs=[{"system_prompt": "You are a helpful assistant."}])))
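For context, the magpie_pre_query_template works by stopping the prompt right where the user turn would begin, so the model's completion is itself a generated user instruction. A minimal sketch, assuming the Llama 3 special tokens (the exact template strings distilabel uses may differ):

```python
def magpie_pre_query_prompt(system_prompt: str) -> str:
    """Build a prompt that ends at the opening of the user turn, so a
    completion from this point generates the user instruction itself."""
    return (
        "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
        f"{system_prompt}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
    )

prompt = magpie_pre_query_prompt("You are a helpful assistant.")
# Nothing follows the user header: the conversation is deliberately left open.
print(prompt)
```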


codspeed-hq bot commented Dec 23, 2024

CodSpeed Performance Report

Merging #1086 will not alter performance

Comparing feat/add-magpie-support-llama-cpp-ollama (4e291e7) with develop (f1f7d77)

Summary

✅ 1 untouched benchmarks

@davidberenstein1957 davidberenstein1957 marked this pull request as ready for review December 23, 2024 16:01
@davidberenstein1957 davidberenstein1957 changed the title Feat/add magpie support llama cpp ollama Add magpie support llama cpp ollama Dec 24, 2024