Update FaqGen README.md for its workflow (opea-project#910)

Signed-off-by: Tsai, Louie <louie.tsai@intel.com>
louie-tsai authored and JoshuaL3000 committed Oct 16, 2024
1 parent ad066e6 commit 5cb1b20
Changed file: FaqGen/README.md (49 additions, 0 deletions)
In today's data-driven world, organizations across various industries face the challenge of efficiently extracting and presenting key information from large volumes of text.

Our FAQ Generation Application leverages the power of large language models (LLMs) to revolutionize the way you interact with and comprehend complex textual data. By harnessing cutting-edge natural language processing techniques, our application can automatically generate comprehensive and natural-sounding frequently asked questions (FAQs) from your documents, legal texts, customer queries, and other sources. In this example use case, we utilize LangChain to implement FAQ Generation and facilitate LLM inference using Text Generation Inference on Intel Xeon and Gaudi2 processors.

The FaqGen example is implemented using the component-level microservices defined in [GenAIComps](https://github.com/opea-project/GenAIComps). The flow chart below shows the information flow between different microservices for this example.

```mermaid
---
config:
flowchart:
nodeSpacing: 400
rankSpacing: 100
curve: linear
themeVariables:
fontSize: 50px
---
flowchart LR
%% Colors %%
classDef blue fill:#ADD8E6,stroke:#ADD8E6,stroke-width:2px,fill-opacity:0.5
classDef orange fill:#FBAA60,stroke:#ADD8E6,stroke-width:2px,fill-opacity:0.5
classDef orchid fill:#C26DBC,stroke:#ADD8E6,stroke-width:2px,fill-opacity:0.5
classDef invisible fill:transparent,stroke:transparent;
style FaqGen-MegaService stroke:#000000
%% Subgraphs %%
subgraph FaqGen-MegaService["FaqGen MegaService "]
direction LR
LLM([LLM MicroService]):::blue
end
  subgraph UserInterface[" User Interface "]
      direction LR
      a([User Input Query]):::orchid
      UI([UI Server]):::orchid
  end
  LLM_gen{{LLM Service}}
  GW([FaqGen GateWay]):::orange

  %% Question flow
  a --> UI
  UI --> GW
  GW <==> FaqGen-MegaService

  %% LLM service flow
  LLM <-.-> LLM_gen
```
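As the diagram shows, clients never call the LLM microservice directly; the FaqGen gateway is the single entry point, and the MegaService routes the request to the LLM microservice behind it. The sketch below shows a minimal client for this flow. It assumes the gateway listens on `localhost:8888` and exposes a `/v1/faqgen` route accepting a JSON body with `messages` and `max_tokens` fields; these names are deployment-specific assumptions, so verify them against your compose file before use.

```python
import json
from urllib import request

# Assumed endpoint: host, port, and route depend on your deployment;
# check the docker compose configuration for the actual values.
GATEWAY_URL = "http://localhost:8888/v1/faqgen"


def build_payload(document_text: str, max_tokens: int = 128) -> bytes:
    """Serialize the FAQ-generation request body as JSON bytes."""
    return json.dumps({
        "messages": document_text,   # the source text to generate FAQs from
        "max_tokens": max_tokens,    # cap on the generated response length
    }).encode("utf-8")


def generate_faqs(document_text: str) -> str:
    """POST the document to the FaqGen gateway and return the raw response."""
    req = request.Request(
        GATEWAY_URL,
        data=build_payload(document_text),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:  # raises URLError if the gateway is down
        return resp.read().decode("utf-8")
```

Once the services are up, something like `generate_faqs(open("report.txt").read())` would return the generated FAQ text for that document.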

## Deploy FAQ Generation Service

The FAQ Generation service can be deployed on either Intel Gaudi2 or Intel Xeon Scalable Processors.
