The video shows how to fine-tune Mixtral

Mistral’s 8x7B Mixture of Experts (MoE) model outperforms Llama 2 70B!

This video walkthrough is simple to follow and uses QLoRA, so you don’t need an A100
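For context on why QLoRA keeps memory low: the base model's weights stay frozen (and quantized), and only a small low-rank update is trained. A toy sketch of that idea, in plain Python with illustrative names and sizes (not the video's actual code):

```python
# Toy sketch of the LoRA update underlying QLoRA: the frozen base weight W
# is augmented with a trainable low-rank product B @ A, scaled by alpha / r.
# In real QLoRA, W would additionally be stored in 4-bit precision.

def matvec(m, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(w * x for w, x in zip(row, v)) for row in m]

def matmul(a, b):
    """Multiply two matrices given as lists of rows."""
    cols = list(zip(*b))
    return [[sum(x * y for x, y in zip(row, col)) for col in cols] for row in a]

def lora_forward(W, A, B, x, alpha=16, r=2):
    """y = W x + (alpha / r) * (B @ A) x  —  W is frozen; only A and B train."""
    scale = alpha / r
    delta = matmul(B, A)          # low-rank update with the same shape as W
    y_base = matvec(W, x)
    y_delta = matvec(delta, x)
    return [yb + scale * yd for yb, yd in zip(y_base, y_delta)]
```

Because B starts at zero in the standard LoRA initialization, the adapted model initially reproduces the base model exactly; training then only updates the tiny A and B matrices, which is what makes fine-tuning fit on a consumer GPU.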

YouTube link is below 🤙 (from @HarperSCarroll)
