An anonymous reader quotes a report: Microsoft Research, the blue-sky division of the software giant, […] announced the release of its Phi-2 small language model (SLM), a text-to-text AI program that is “small enough to run on a laptop or mobile device,” according to a post on X. At the same time, Phi-2, with its 2.7 billion parameters (connections between artificial neurons), boasts performance comparable to other, much larger models, including Meta’s Llama 2-7B with its 7 billion parameters and even Mistral-7B, another 7-billion-parameter model.
Microsoft researchers also noted in their blog post on the Phi-2 release that it outperforms Google’s brand-new Gemini Nano 2 model despite the Google model having half a billion more parameters, and that Phi-2 delivers less “toxicity” and bias in its responses than Llama 2. Microsoft also couldn’t resist taking a little dig at Google’s now much-criticized, staged demo video for […]