This repository has been archived by the owner on Oct 25, 2024. It is now read-only.

fix import issue and update pip install itrex (#397)
Signed-off-by: lvliang-intel <liang1.lv@intel.com>
lvliang-intel authored Sep 26, 2023
1 parent 6eceb08 commit 7a85baf
Showing 17 changed files with 127 additions and 86 deletions.
@@ -43,7 +43,8 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"%cd ../../\n",
+"!git clone https://github.com/intel/intel-extension-for-transformers.git\n",
+"!cd ./intel-extension-for-transformers/intel_extension_for_transformers/neural_chat/\n",
 "!pip install -r requirements.txt"
 ]
 },
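One behavior worth keeping in mind when reading the replacement cells throughout this commit: in a notebook, each `!` line runs in its own subshell, so a `!cd` on one line does not change the working directory for the following `!pip install` line (the `%cd` magic being removed here, by contrast, does persist). A minimal Python sketch of that subshell behavior, using `subprocess` calls to stand in for two `!` lines:

```python
import os
import subprocess
import tempfile

# Each `!` line in a notebook spawns a fresh subshell, so a `cd` there is
# forgotten by the next line -- just like two separate subprocess calls:
wd = tempfile.mkdtemp()
os.makedirs(os.path.join(wd, "sub"))

# First "cell line": cd into sub and print the directory -- works within the line.
out1 = subprocess.run("cd sub && pwd", shell=True, cwd=wd,
                      capture_output=True, text=True).stdout.strip()
# Second "cell line": the earlier cd has no effect here.
out2 = subprocess.run("pwd", shell=True, cwd=wd,
                      capture_output=True, text=True).stdout.strip()

print(out1.endswith("/sub"))  # the cd applied inside its own line
print(out2.endswith("/sub"))  # but did not carry over to the next one
```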
@@ -51,7 +51,8 @@
 },
 "outputs": [],
 "source": [
-"%cd ../../\n",
+"!git clone https://github.com/intel/intel-extension-for-transformers.git\n",
+"!cd ./intel-extension-for-transformers/intel_extension_for_transformers/neural_chat/\n",
 "!pip install -r requirements.txt"
 ]
 },
@@ -43,7 +43,8 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"%cd ../../\n",
+"!git clone https://github.com/intel/intel-extension-for-transformers.git\n",
+"!cd ./intel-extension-for-transformers/intel_extension_for_transformers/neural_chat/\n",
 "!pip install -r requirements.txt"
 ]
 },
@@ -45,7 +45,8 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"%cd ../../\n",
+"!git clone https://github.com/intel/intel-extension-for-transformers.git\n",
+"!cd ./intel-extension-for-transformers/intel_extension_for_transformers/neural_chat/\n",
 "!pip install -r requirements.txt"
 ]
 },
@@ -20,8 +20,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"%%bash\n",
-"%pip install intel-extension-for-transformers"
+"!pip install intel-extension-for-transformers"
 ]
 },
 {
@@ -37,7 +36,8 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"%cd ../../\n",
+"!git clone https://github.com/intel/intel-extension-for-transformers.git\n",
+"!cd ./intel-extension-for-transformers/intel_extension_for_transformers/neural_chat/\n",
 "!pip install -r requirements.txt"
 ]
 },
@@ -25,7 +25,29 @@
 },
 "outputs": [],
 "source": [
-"pip install intel-extension-for-transformers"
+"!pip install intel-extension-for-transformers"
 ]
 },
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"Install requirements:"
+]
+},
+{
+"cell_type": "code",
+"execution_count": null,
+"metadata": {
+"vscode": {
+"languageId": "shellscript"
+}
+},
+"outputs": [],
+"source": [
+"!git clone https://github.com/intel/intel-extension-for-transformers.git\n",
+"!cd ./intel-extension-for-transformers/intel_extension_for_transformers/neural_chat/\n",
+"!pip install -r requirements.txt"
+]
+},
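After an install cell like the one added above, a quick sanity check that the wheel is importable can save debugging time. A small sketch (note the pip package name uses dashes, while the importable module uses underscores):

```python
import importlib.util

def is_installed(module_name: str) -> bool:
    """Return True if the module can be found on the current interpreter's path."""
    return importlib.util.find_spec(module_name) is not None

# The pip package is `intel-extension-for-transformers`; the module it
# provides is `intel_extension_for_transformers`.
print(is_installed("intel_extension_for_transformers"))
```

`find_spec` only locates the module without importing it, so the check is cheap and has no side effects.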
{
@@ -47,34 +69,34 @@
 ]
 },
 {
 "cell_type": "code",
 "execution_count": null,
 "metadata": {},
 "outputs": [],
 "source": [
 "!git lfs install\n",
 "!git clone https://huggingface.co/decapoda-research/llama-7b-hf"
 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
 "source": [
 "To avoid unexpected conflict issues, we advise users to modify the local `config.json` and `tokenizer_config.json` files according to the following recommendations:\n",
 "\n",
 "1. The `tokenizer_class` in `tokenizer_config.json` should be changed from `LLaMATokenizer` to `LlamaTokenizer`;\n",
 "2. The `architectures` in `config.json` should be changed from `LLaMAForCausalLM` to `LlamaForCausalLM`"
 ]
 },
 {
 "cell_type": "code",
 "execution_count": null,
 "metadata": {},
 "outputs": [],
 "source": [
 "!curl -OL https://raw.githubusercontent.com/tatsu-lab/stanford_alpaca/main/alpaca_data.json"
 ]
 },
 {
 "cell_type": "markdown",
 "metadata": {},
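The two config edits recommended in the notebook cell above are mechanical, so they can be scripted with the standard `json` module. A sketch under stated assumptions: the file contents below are minimal stand-ins created in a temporary directory, where a real run would point `tok_path` and `cfg_path` at the files inside the local `llama-7b-hf` clone:

```python
import json
import os
import tempfile

def patch_json(path, key, old, new):
    """Replace `old` with `new` under `key`, leaving other fields untouched."""
    with open(path) as f:
        data = json.load(f)
    if data.get(key) == old:
        data[key] = new
    elif isinstance(data.get(key), list):
        data[key] = [new if v == old else v for v in data[key]]
    with open(path, "w") as f:
        json.dump(data, f, indent=2)

repo = tempfile.mkdtemp()  # stand-in for the ./llama-7b-hf checkout
tok_path = os.path.join(repo, "tokenizer_config.json")
cfg_path = os.path.join(repo, "config.json")
with open(tok_path, "w") as f:
    json.dump({"tokenizer_class": "LLaMATokenizer"}, f)
with open(cfg_path, "w") as f:
    json.dump({"architectures": ["LLaMAForCausalLM"]}, f)

# Apply the two renames recommended above.
patch_json(tok_path, "tokenizer_class", "LLaMATokenizer", "LlamaTokenizer")
patch_json(cfg_path, "architectures", "LLaMAForCausalLM", "LlamaForCausalLM")
```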
@@ -25,7 +25,29 @@
 },
 "outputs": [],
 "source": [
-"pip install intel-extension-for-transformers"
+"!pip install intel-extension-for-transformers"
 ]
 },
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"Install requirements:"
+]
+},
+{
+"cell_type": "code",
+"execution_count": null,
+"metadata": {
+"vscode": {
+"languageId": "shellscript"
+}
+},
+"outputs": [],
+"source": [
+"!git clone https://github.com/intel/intel-extension-for-transformers.git\n",
+"!cd ./intel-extension-for-transformers/intel_extension_for_transformers/neural_chat/\n",
+"!pip install -r requirements.txt"
+]
+},
 {
@@ -43,7 +43,8 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"%cd ../../\n",
+"!git clone https://github.com/intel/intel-extension-for-transformers.git\n",
+"!cd ./intel-extension-for-transformers/intel_extension_for_transformers/neural_chat/\n",
 "!pip install -r requirements.txt"
 ]
 },
@@ -43,7 +43,8 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"%cd ../../\n",
+"!git clone https://github.com/intel/intel-extension-for-transformers.git\n",
+"!cd ./intel-extension-for-transformers/intel_extension_for_transformers/neural_chat/\n",
 "!pip install -r requirements.txt"
 ]
 },
@@ -45,7 +45,8 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"%cd ../../\n",
+"!git clone https://github.com/intel/intel-extension-for-transformers.git\n",
+"!cd ./intel-extension-for-transformers/intel_extension_for_transformers/neural_chat/\n",
 "!pip install -r requirements.txt"
 ]
 },
@@ -43,7 +43,8 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"%cd ../../\n",
+"!git clone https://github.com/intel/intel-extension-for-transformers.git\n",
+"!cd ./intel-extension-for-transformers/intel_extension_for_transformers/neural_chat/\n",
 "!pip install -r requirements.txt"
 ]
 },
@@ -34,7 +34,25 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"%pip install intel-extension-for-transformers"
+"!pip install intel-extension-for-transformers"
 ]
 },
+{
+"cell_type": "markdown",
+"metadata": {},
+"source": [
+"Install requirements:"
+]
+},
+{
+"cell_type": "code",
+"execution_count": null,
+"metadata": {},
+"outputs": [],
+"source": [
+"!git clone https://github.com/intel/intel-extension-for-transformers.git\n",
+"!cd ./intel-extension-for-transformers/intel_extension_for_transformers/neural_chat/\n",
+"!pip install -r requirements.txt"
+]
+},
 {
@@ -27,9 +27,10 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"!git clone https://github.com/intel/intel-extension-for-transformers.git && cd ./intel-extension-for-transformers/\n",
-"!pip install -r requirements.txt && pip install -v .\n",
-"!cd ./intel_extension_for_transformers/neural_chat/ && pip install -r requirements.txt"
+"!pip install intel-extension-for-transformers\n",
+"!git clone https://github.com/intel/intel-extension-for-transformers.git\n",
+"!cd ./intel-extension-for-transformers/intel_extension_for_transformers/neural_chat/\n",
+"!pip install -r requirements.txt"
 ]
 },
 {
@@ -27,9 +27,10 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"!git clone https://github.com/intel/intel-extension-for-transformers.git && cd ./intel-extension-for-transformers/ && pip install -r requirements.txt\n",
-"!pip install -v .\n",
-"!cd ./intel_extension_for_transformers/neural_chat/ && pip install -r requirements.txt\n",
+"!pip install intel-extension-for-transformers\n",
+"!git clone https://github.com/intel/intel-extension-for-transformers.git\n",
+"!cd ./intel-extension-for-transformers/intel_extension_for_transformers/neural_chat/\n",
+"!pip install -r requirements.txt\n",
 "!sudo apt install numactl\n",
 "!conda install astunparse ninja pyyaml mkl mkl-include setuptools cmake cffi typing_extensions future six requests dataclasses -y\n",
 "!conda install jemalloc gperftools -c conda-forge -y"
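The `numactl` and jemalloc packages installed in this cell are typically combined into a pinned launch command afterwards. A small helper sketching what that command looks like; the NUMA node, core range, jemalloc path, and script name are all illustrative assumptions to adapt to your machine:

```python
import shlex

def pinned_command(script, node=0, cores="0-27",
                   jemalloc="/opt/conda/lib/libjemalloc.so"):
    """Build a numactl-pinned command line with jemalloc preloaded."""
    argv = ["numactl", "-m", str(node), "-C", cores, "python", script]
    return f"LD_PRELOAD={jemalloc} " + " ".join(shlex.quote(a) for a in argv)

# `run_chat.py` is a placeholder for whatever entry script you launch.
print(pinned_command("run_chat.py"))
```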
@@ -34,7 +34,10 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"%pip install intel-extension-for-transformers"
+"!pip install intel-extension-for-transformers\n",
+"!git clone https://github.com/intel/intel-extension-for-transformers.git\n",
+"!cd ./intel-extension-for-transformers/intel_extension_for_transformers/neural_chat/\n",
+"!pip install -r requirements.txt"
 ]
 },
 {
@@ -43,7 +43,8 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"%cd ../../\n",
+"!git clone https://github.com/intel/intel-extension-for-transformers.git\n",
+"!cd ./intel-extension-for-transformers/intel_extension_for_transformers/neural_chat/\n",
 "!pip install -r requirements.txt"
 ]
 },
@@ -17,7 +17,7 @@
 
 from .base_model import BaseModel, register_model_adapter
 import logging
-from fastchat.conversation import get_conv_template, Conversation, register_conv_template, SeparatorStyle
+from fastchat.conversation import get_conv_template, Conversation
 
 logging.basicConfig(
     format="%(asctime)s - %(levelname)s - %(name)s - %(message)s",
@@ -26,41 +26,6 @@
 )
 logger = logging.getLogger(__name__)
 
-# neuralchat-v2 prompt template
-register_conv_template(
-    Conversation(
-        name="neural-chat-7b-v2",
-        system_message="""### System:
-- You are a helpful assistant chatbot trained by Intel.
-- You answer questions.
-- You are excited to be able to help the user, but will refuse to do anything that could be considered harmful to the user.
-- You are more than just an information source, you are also able to write poetry, \
-short stories, and make jokes.</s>\n""",
-        roles=("### User:", "### Assistant:"),
-        sep_style=SeparatorStyle.NO_COLON_TWO,
-        sep="\n",
-        sep2="</s>",
-    )
-)
-
-# neuralchat-v1.1 prompt template
-register_conv_template(
-    Conversation(
-        name="neural-chat-7b-v1.1",
-        system_template="""<|im_start|>system
-{system_message}""",
-        system_message="""- You are a helpful assistant chatbot trained by Intel.
-- You answer questions.
-- You are excited to be able to help the user, but will refuse to do anything that could be considered harmful to the user.
-- You are more than just an information source, you are also able to write poetry, short stories, and make jokes.""",
-        roles=("<|im_start|>user", "<|im_start|>assistant"),
-        sep_style=SeparatorStyle.CHATML,
-        sep="<|im_end|>",
-        stop_token_ids=[50278, 0],
-    )
-)
-
-
 class NeuralChatModel(BaseModel):
     def match(self, model_path: str):
         """
