Phi-3 mini model released under MIT! 🚀 Last week Llama 3, this week Phi-3 🤯 Microsoft's Phi-3 comes in three sizes: mini (3.8B), small (7B) & medium (14B). Phi-3-mini was released, claiming to match Llama 3 8B performance! 🚀

What is Phi-3 Mini?
Phi-3 Mini is a lightweight language model with 3.8 billion parameters. It belongs to the Phi-3 family and comes in two variants: 4K and 128K context lengths.

Training and Fine-Tuning:
- Trained on a massive 3.3 trillion tokens.
- Fine-tuned using supervised fine-tuning (SFT) and direct preference optimization (DPO).

Performance Metrics:
- 68.8% on MMLU (common sense, language understanding, mathematics, coding, etc.).
- 8.38 on MT-Bench (long-term context and logical reasoning).
- Outperforms models like Mistral 7B and Llama 3 8B Instruct.

Availability and Platforms:
- Available on Hugging Face and in Hugging Chat (a minimal inference sketch follows below).
- Runs on Android and iPhone.

Use Cases:
- Ideal for memory/compute-constrained environments.
- Stro...
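If you want to try it locally, here is a minimal inference sketch using the transformers library. It assumes the Hub repo id microsoft/Phi-3-mini-4k-instruct (the 4K-context variant) and a recent transformers version; the prompt, dtype, and device settings are illustrative, not prescriptive.

```python
# Minimal sketch: run Phi-3 Mini with Hugging Face transformers.
# Assumes the Hub id "microsoft/Phi-3-mini-4k-instruct"; older transformers
# releases may additionally need trust_remote_code=True.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 3.8B params fits on a single consumer GPU in bf16
    device_map="auto",
)

# Format the conversation with the model's chat template
messages = [{"role": "user", "content": "Explain what a context window is in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate and print only the newly generated tokens
outputs = model.generate(inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The same checkpoint also works through the high-level `pipeline("text-generation", ...)` API if you prefer not to handle tokenization yourself.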