
Conversation

@NovTi (Contributor) commented Dec 7, 2023

Update the Chapter 5 notebooks 5_1_ChatBot and 5_1_2_Speech Recognition in both the English and Chinese versions.

"\n",
"model_path = snapshot_download(repo_id='meta-llama/Llama-2-7b-chat-hf',\n",
" token='hf_XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX') # change it to your own Hugging Face access token"
" token='hf_XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX') # change it to your own Hugging Face access token\n"
Contributor:

A leading blank space is still needed here to align with the line above.
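For context, Jupyter concatenates a cell's `source` strings verbatim when rendering, so both the trailing `\n` on the first line and the leading spaces on the continuation line matter. A minimal sketch (the token is a shortened placeholder, and the 31 spaces are what line the continuation up under `repo_id=`):

```python
# Jupyter renders a cell by joining its "source" list of strings verbatim.
# The first line needs a trailing "\n"; the second needs 31 leading spaces
# so that token= lines up under repo_id= in the snapshot_download(...) call.
source = [
    "model_path = snapshot_download(repo_id='meta-llama/Llama-2-7b-chat-hf',\n",
    " " * 31 + "token='hf_XXXXXXXX')  # change it to your own Hugging Face access token\n",
]
rendered = "".join(source)
print(rendered)
```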

"### 5.1.2.1 Load Model in Low Precision\n",
"\n",
"One common use case is to load a Hugging Face *transformers* model in low precision, i.e. conduct **implicit** quantization while loading.\n",
" One common use case is to load a Hugging Face *transformers* model in low precision, i.e. conduct **implicit** quantization while loading.\n",
Contributor:

Should we remove the "5.1.2.1 Load Model in Low Precision" section, @shane-huang? If we remove it, the following sections will also need to be modified.

"from bigdl.llm.transformers import AutoModelForCausalLM\n",
"\n",
"model_in_4bit = AutoModelForCausalLM.from_pretrained(pretrained_model_name_or_path=\"meta-llama/Llama-2-7b-chat-hf\",\n",
"model_in_4bit = AutoModelForCausalLM.from_pretrained(pretrained_model_name_or_path=\"../chat-7b-hf/\",\n",
Contributor:

I think we should still use "meta-llama/Llama-2-7b-chat-hf", as that is the common usage.

"from transformers import LlamaTokenizer\n",
"\n",
"tokenizer = LlamaTokenizer.from_pretrained(pretrained_model_name_or_path=\"meta-llama/Llama-2-7b-chat-hf\")"
"tokenizer = LlamaTokenizer.from_pretrained(pretrained_model_name_or_path=\"../chat-7b-hf/\")"
Contributor:

Similarly, I think we should still use "meta-llama/Llama-2-7b-chat-hf".

"metadata": {},
"outputs": [
{
"name": "stdout",
Contributor:

I don't think we need this output here. Maybe we could clear it.

],
"source": [
"SYSTEM_PROMPT = \"You are a helpful, respectful and honest assistant, who always answers as helpfully as possible, while being safe.\"\n",
"SYSTEM_PROMPT = \"You are a helpful, respectful and honest assistant.\"\n",
Contributor:

Adding the two code blocks here seems a little strange.
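As an aside, the SYSTEM_PROMPT presumably gets wrapped into Llama-2's chat template before generation. A minimal sketch of that formatting (the `[INST]`/`<<SYS>>` markers follow Meta's documented Llama-2 chat format; the helper function name is hypothetical):

```python
SYSTEM_PROMPT = "You are a helpful, respectful and honest assistant."

def format_llama2_prompt(user_message: str, system_prompt: str = SYSTEM_PROMPT) -> str:
    # Llama-2 chat models expect the system prompt inside <<SYS>> ... <</SYS>>,
    # with the whole first turn wrapped in [INST] ... [/INST].
    return f"[INST] <<SYS>>\n{system_prompt}\n<</SYS>>\n\n{user_message} [/INST]"

prompt = format_llama2_prompt("What is AI?")
print(prompt)
```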
