
update system libs (#1273)
* update system libs
* update mainpage

Signed-off-by: Wenxin Zhang <wenxin.zhang@intel.com>
VincyZhang authored Feb 15, 2024
1 parent 033b7d5 commit f565baa
Showing 2 changed files with 39 additions and 13 deletions.
5 changes: 4 additions & 1 deletion README.md
@@ -102,7 +102,7 @@ Intel® Extension for Transformers is an innovative toolkit designed to accelera

> In the table above, "-" means not applicable or not started yet.
## Validated Software
## 🔓Validated Software
<table>
<tbody>
<tr>
@@ -163,6 +163,9 @@ Intel® Extension for Transformers is an innovative toolkit designed to accelera

> Please refer to the detailed requirements in [CPU](intel_extension_for_transformers/neural_chat/requirements_cpu.txt), [Gaudi2](intel_extension_for_transformers/neural_chat/requirements_hpu.txt), [Intel GPU](https://github.com/intel/intel-extension-for-transformers/blob/main/requirements-gpu.txt).
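
As a rough illustration of how those requirement files are typically used, a minimal NeuralChat CPU setup could look like the sketch below (the clone location and the plain `pip install -r` invocation are assumptions, not prescribed steps):

```bash
# Sketch: install the NeuralChat CPU requirements from a local clone (illustrative)
git clone https://github.com/intel/intel-extension-for-transformers.git
cd intel-extension-for-transformers
pip install -r intel_extension_for_transformers/neural_chat/requirements_cpu.txt
```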
## 🔓Validated OS
Ubuntu 20.04/22.04, CentOS 8.

## 🌱Getting Started

### Chatbot
47 changes: 35 additions & 12 deletions docs/installation.md
@@ -1,21 +1,45 @@
# Installation

1. [Install from Pypi](#install-from-pypi)
1. [System Requirement](#system-requirements)

2. [Install from Source](#install-from-source)
2. [Install from Pypi](#install-from-pypi)

3. [Install from Source](#install-from-source)

2.1. [Prerequisites](#prerequisites)

2.2. [Install Intel Extension for Transformers](#install-intel-extension-for-transformers)

3. [System Requirements](#system-requirements)
4. [Validated Environment](#validated-environment)

3.1. [Validated Hardware Environment](#validated-hardware-environment)

3.2. [Validated Software Environment](#validated-software-environment)


## System Requirements
For NeuralChat usage, please make sure you have the following system libraries installed:

### Ubuntu 20.04/22.04
```bash
apt-get update
apt-get install -y ffmpeg
apt-get install -y libgl1-mesa-glx libgl1-mesa-dev
apt-get install -y libsm6 libxext6
```

### CentOS 8
```bash
yum update -y
yum install -y mesa-libGL mesa-libGL-devel
yum install -y libXext libSM libXrender
```

For ffmpeg on CentOS 8, please refer to [how-to-install-ffmpeg-on-centos-rhel-8](https://computingforgeeks.com/how-to-install-ffmpeg-on-centos-rhel-8/).
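
As a rough sketch of what that guide covers, one common route on CentOS 8 is to pull ffmpeg from the RPM Fusion repository (the release package URL below is an assumption; defer to the linked guide for the current steps):

```bash
# Sketch: install ffmpeg on CentOS 8 via EPEL + RPM Fusion (illustrative URLs)
dnf install -y epel-release
dnf install -y https://download1.rpmfusion.org/free/el/rpmfusion-free-release-8.noarch.rpm
dnf install -y ffmpeg
```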


## Install from Pypi
Binary builds for python 3.8, 3.9 and 3.10 are available in Pypi
Binary builds for python 3.9, 3.10 and 3.11 are available in Pypi

>**Note**: We recommend installing protobuf <= 3.20.0 if using onnxruntime <= 1.11.
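
For reference, installing from PyPI in a fresh virtual environment might look like the sketch below (the environment name is illustrative, and the protobuf pin only applies if you hit the onnxruntime case above):

```bash
# Sketch: PyPI install in a throwaway virtual environment (names are illustrative)
python -m venv itrex-env
source itrex-env/bin/activate
pip install intel-extension-for-transformers
pip install "protobuf<=3.20.0"   # only needed together with onnxruntime <= 1.11
```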
@@ -33,10 +57,8 @@ conda install -c intel intel_extension_for_transformers

### Prerequisites
The following prerequisites and requirements must be satisfied for a successful installation:
- Python version: 3.8 or 3.9 or 3.10
- GCC >= version 8 (on Linux)
- version 11, if use the bf16-related features of the itrex backend
- version 13, if use the fp16-related features of the itrex backend
- Python version: 3.9 or 3.10 or 3.11
- GCC >= version 10 (on Linux)
- Visual Studio (on Windows)
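
A quick sanity check of the Linux prerequisites could be as simple as:

```bash
# Verify the interpreter and compiler versions before building
python --version   # expect 3.9, 3.10, or 3.11
gcc --version      # expect GCC >= 10 on Linux
```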

### Install Intel Extension for Transformers
@@ -48,7 +70,8 @@ pip install -r requirements.txt
pip install -v .
```
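
Putting the from-source steps together, the end-to-end sequence is roughly the following sketch (assuming the default branch and an already activated Python environment):

```bash
# Sketch: build and install from source (default branch assumed)
git clone https://github.com/intel/intel-extension-for-transformers.git
cd intel-extension-for-transformers
pip install -r requirements.txt
pip install -v .
```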

## System Requirements
## Validated Environment

### Validated Hardware Environment
Intel® Extension for Transformers supports the following HWs:

@@ -60,7 +83,7 @@ Intel® Extension for Transformers supports the following HWs:
### Validated Software Environment

* OS version: CentOS 8.4, Ubuntu 20.04
* Python version: 3.8, 3.9, 3.10
* Python version: 3.9, 3.10, 3.11

<table class="docutils">
<thead>
@@ -85,7 +108,7 @@ Intel® Extension for Transformers supports the following HWs:
</table>

* OS version: Windows 10
* Python version: 3.8, 3.9, 3.10
* Python version: 3.9, 3.10, 3.11

<table class="docutils">
<thead>
@@ -99,7 +122,7 @@ Intel® Extension for Transformers supports the following HWs:
<tr align="center">
<th>Version</th>
<td><a href=https://github.com/Intel-tensorflow/tensorflow/tree/v2.13.0>2.13.0</a><br>
<td><a href=https://download.pytorch.org/whl/torch_stable.html>2.0.0+cpu</a><br>
<td><a href=https://download.pytorch.org/whl/torch_stable.html>2.1.0+cpu</a><br>
</tr>
</tbody>
</table>
