From 8c6881e7ad19cbbaef5714914b28b2791b1c4f4c Mon Sep 17 00:00:00 2001
From: Louie Tsai
Date: Mon, 7 Oct 2024 22:37:52 -0700
Subject: [PATCH] Update FaqGen README.md for its workflow

Signed-off-by: Tsai, Louie
---
 FaqGen/README.md | 49 ++++++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 49 insertions(+)

diff --git a/FaqGen/README.md b/FaqGen/README.md
index 8ea498e54..06bb7c3d8 100644
--- a/FaqGen/README.md
+++ b/FaqGen/README.md
@@ -4,6 +4,55 @@ In today's data-driven world, organizations across various industries face the c
 
 Our FAQ Generation Application leverages the power of large language models (LLMs) to revolutionize the way you interact with and comprehend complex textual data. By harnessing cutting-edge natural language processing techniques, our application can automatically generate comprehensive and natural-sounding frequently asked questions (FAQs) from your documents, legal texts, customer queries, and other sources. In this example use case, we utilize LangChain to implement FAQ Generation and facilitate LLM inference using Text Generation Inference on Intel Xeon and Gaudi2 processors.
 
+The FaqGen example is implemented using the component-level microservices defined in [GenAIComps](https://github.com/opea-project/GenAIComps). The flow chart below shows the information flow between different microservices for this example.
+
+```mermaid
+---
+config:
+  flowchart:
+    nodeSpacing: 400
+    rankSpacing: 100
+    curve: linear
+  themeVariables:
+    fontSize: 50px
+---
+flowchart LR
+    %% Colors %%
+    classDef blue fill:#ADD8E6,stroke:#ADD8E6,stroke-width:2px,fill-opacity:0.5
+    classDef orange fill:#FBAA60,stroke:#ADD8E6,stroke-width:2px,fill-opacity:0.5
+    classDef orchid fill:#C26DBC,stroke:#ADD8E6,stroke-width:2px,fill-opacity:0.5
+    classDef invisible fill:transparent,stroke:transparent;
+    style FaqGen-MegaService stroke:#000000
+
+    %% Subgraphs %%
+    subgraph FaqGen-MegaService["FaqGen MegaService "]
+        direction LR
+        LLM([LLM MicroService]):::blue
+    end
+    subgraph UserInterface[" User Interface "]
+        direction LR
+        a([User Input Query]):::orchid
+        UI([UI server<br>]):::orchid
+    end
+
+
+    LLM_gen{{LLM Service<br>}}
+    GW([FaqGen Gateway<br>]):::orange
+
+
+    %% Questions interaction
+    direction LR
+    a[User Input Query] --> UI
+    UI --> GW
+    GW <==> FaqGen-MegaService
+
+
+    %% LLM service flow
+    direction LR
+    LLM <-.-> LLM_gen
+
+```
+
 ## Deploy FAQ Generation Service
 
 The FAQ Generation service can be deployed on either Intel Gaudi2 or Intel Xeon Scalable Processors.
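Once deployed, the FaqGen Gateway shown in the flow chart can be exercised with a request like the sketch below. The host variable, port `8888`, and route `/v1/faqgen` are assumptions based on a typical compose setup for this example and may differ in your deployment.

```shell
# Send source text to the (assumed) FaqGen Gateway endpoint; the MegaService
# forwards it through the LLM MicroService and returns the generated FAQs.
# ${host_ip} is a placeholder for the machine running the FaqGen MegaService.
curl http://${host_ip}:8888/v1/faqgen \
  -H "Content-Type: application/json" \
  -d '{"messages": "Text Generation Inference (TGI) is a toolkit for deploying and serving large language models."}'
```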