
Commit 294f1a0
🎨 prettier formatting fixes for chatqna readme
Signed-off-by: Krishna Murti <[email protected]>
krish918 committed Oct 9, 2024
1 parent a8b85d7 commit 294f1a0
Showing 1 changed file with 1 addition and 1 deletion.
helm-charts/chatqna/README.md (1 addition, 1 deletion)
```diff
@@ -22,7 +22,7 @@ For LLM inference, two more microservices will be required. We can either use [T
 - [llm-ctrl-uservice](../common/llm-ctrl-uservice/README.md)
 - [vllm](../common/vllm/README.md)

-> **__Note :__** We shouldn't have both inference engine in our setup. We have to setup either of them. For this, conditional flags are added in the chart dependency. We will be switching off flag corresponding to one service and switching on the other, in order to have a proper setup of all ChatQnA dependencies.
+> **Note:** We shouldn't have both inference engine in our setup. We have to setup either of them. For this, conditional flags are added in the chart dependency. We will be switching off flag corresponding to one service and switching on the other, in order to have a proper setup of all ChatQnA dependencies.
 ## Installing the Chart
```

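The note in the diff above refers to conditional flags on the chart's dependencies, so that exactly one inference engine (TGI or vLLM) is rendered. A minimal sketch of how such Helm `condition` flags typically look; the key names (`tgi.enabled`, `vllm.enabled`) and repository paths are assumptions for illustration, not taken from the actual chart:

```yaml
# Chart.yaml (hypothetical dependency stanza)
# Each dependency is only rendered when its `condition` value is true.
dependencies:
  - name: tgi
    repository: file://../common/tgi
    condition: tgi.enabled
  - name: vllm
    repository: file://../common/vllm
    condition: vllm.enabled

# values.yaml — switch one engine off and the other on
tgi:
  enabled: false
vllm:
  enabled: true
```

The same toggle can be applied at install time without editing values.yaml, e.g. `helm install chatqna . --set tgi.enabled=false --set vllm.enabled=true`.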
