Mistral
sagatake edited this page Sep 25, 2024
The Mistral module lets Greta dialog with a Mistral language model, either locally (via LM Studio) or through the online Mistral API.
- Make sure you have conda and Python installed (you can install both together with Anaconda: https://www.anaconda.com/)
_Warning: the local model requires a 6 GB GPU._
- Install LM Studio: https://lmstudio.ai/
- Download the model TheBloke/Mistral-7B-Instruct-v0.2-GGUF
- Run the local server in LM Studio with the following setting:
  - port 1234
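LM Studio's local server exposes an OpenAI-compatible REST API on the chosen port. The sketch below is illustrative, not Greta's actual code: it assumes the default endpoint `http://localhost:1234/v1/chat/completions` with a running server, and the helper names `build_payload` and `ask_local_mistral` are invented here.

```python
import json
import urllib.request

# Assumed LM Studio endpoint (OpenAI-compatible); adjust the port if you
# configured a different one in the server settings.
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_payload(user_message, system_prompt=None):
    """Build an OpenAI-style chat payload; the system prompt is optional."""
    messages = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": user_message})
    return {"messages": messages, "temperature": 0.7}

def ask_local_mistral(user_message, system_prompt=None):
    """Send one chat request to the local LM Studio server."""
    data = json.dumps(build_payload(user_message, system_prompt)).encode("utf-8")
    req = urllib.request.Request(
        LMSTUDIO_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

The optional system prompt here corresponds to the system prompt field described in the module's configuration steps below.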
_Warning: the online API is not free, except for a two-week trial offering €5 of test credit._
- Create an API key at https://console.mistral.ai/billing/
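The online service is a plain HTTPS endpoint authenticated with that API key. As a minimal sketch (assuming the public `https://api.mistral.ai/v1/chat/completions` endpoint and the `mistral-small-latest` model name; the helper names are invented here, not Greta's actual code):

```python
import json
import urllib.request

# Assumed public Mistral chat-completions endpoint.
MISTRAL_API_URL = "https://api.mistral.ai/v1/chat/completions"

def build_request(api_key, user_message, model="mistral-small-latest"):
    """Build the headers and payload for one chat-completion request."""
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",  # key created in the console
    }
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return headers, payload

def ask_mistral(api_key, user_message):
    """Send one request to the online Mistral API and return the reply text."""
    headers, payload = build_request(api_key, user_message)
    req = urllib.request.Request(
        MISTRAL_API_URL, data=json.dumps(payload).encode("utf-8"), headers=headers
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

The same request shape works for the local LM Studio server, which is why switching between the local and online backends only changes the URL, key, and model name.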
- In Common/Data/Mistral, open Mistral.py and replace "my_apy_key" with the API key created above
- Add a Mistral Module to your Configuration
- Add a link between the Mistral module and the Behavior Planner
- Set up Port and Address (the defaults can be kept)
- Check Enable
- Choose language and model (local or online)
- (Optional) Enter a system prompt if you want your agent to behave a certain way, e.g. "You are a museum guide who makes a lot of jokes. You always answer in rhymes."
- Enter your request in the request panel
- Click Send
You can use a different model by modifying the model name in Mistral.py
An example demo integrating this module is available at LLM DeepASR integration.