# Fine-tuning TinyLlama

Used LoRA to fine-tune TinyLlama to generate hex color codes from color descriptions.

## Demo

## Run model inference

Note: you need a GPU to run this, or you can use Google Colab.

Head to the "Fine-tuned model inference" section in the Python notebook to run inference.
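For reference, below is a minimal inference sketch of the same flow the notebook follows: load the base model, attach the LoRA adapter with `peft`, and generate a hex code for a color description. The base model ID `TinyLlama/TinyLlama-1.1B-Chat-v1.0`, the adapter directory `./tinyllama-color-lora`, and the prompt format are assumptions here; adjust them to match what the notebook actually uses.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Assumed names: swap these for the base model and adapter path used in the notebook.
base_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"
adapter_path = "./tinyllama-color-lora"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.float16, device_map="auto"  # needs a GPU (or Colab)
)
model = PeftModel.from_pretrained(base, adapter_path)  # attach the LoRA weights
model.eval()

# Example prompt format (assumed): plain color description in, hex code out.
prompt = "Give the hex color code for: deep ocean blue\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=16, do_sample=False)

# Print only the newly generated tokens (the model's answer).
print(tokenizer.decode(out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```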