What is Orca 2? Microsoft’s latest version could outperform smaller models and compete with larger models

The AI race is only heating up. Beyond the OpenAI leadership shake-up involving (now former) CEO Sam Altman, the company's board of directors, and Microsoft, the Redmond-based technology giant has "quietly" launched its latest small language models. Called Orca 2, at first glance it could be Microsoft's answer to the growing AI challenge.

Orca 2 doesn't just talk, it walks: it outperforms similarly sized models and takes on models nearly ten times larger, especially on tricky tasks that test advanced reasoning.

Available in two sizes, 7 billion and 13 billion parameters, both fine-tuned on specialized synthetic data, Orca 2's weights are being released publicly to "encourage research" into smaller language models, Microsoft says.

Check out the charts below to see how Orca 2 performs across a variety of benchmarks compared to other similarly sized models, and even models 5-10 times larger.

"The training data was generated in such a way that it teaches Orca 2 various reasoning techniques, such as step-by-step processing, recall-then-generate, recall-reason-generate, extract-generate, and direct-answer methods, while teaching it to choose different solution strategies for different tasks," Microsoft says in the official announcement.
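To make the idea concrete, here is a toy sketch of what "choosing different solution strategies for different tasks" could look like at inference time. The strategy names come from Microsoft's announcement, but the routing logic, prompt templates, and function names below are purely illustrative assumptions, not Orca 2's actual implementation (Orca 2 learns this behavior from its training data rather than from hand-written rules).

```python
# Hypothetical sketch: routing a task to one of the reasoning strategies
# named in Microsoft's announcement. All prompts and keyword rules here
# are illustrative assumptions, not Orca 2's real mechanism.

STRATEGY_PROMPTS = {
    "step_by_step": "Solve the problem, showing each reasoning step.",
    "recall_then_generate": "First recall the relevant facts, then answer.",
    "direct_answer": "Answer concisely without showing your work.",
}

def pick_strategy(task: str) -> str:
    """Toy router: choose a strategy from crude task keywords."""
    task_lower = task.lower()
    if any(word in task_lower for word in ("prove", "calculate", "solve")):
        return "step_by_step"
    if any(word in task_lower for word in ("who", "when", "capital")):
        return "recall_then_generate"
    return "direct_answer"

def build_prompt(task: str) -> str:
    """Prepend the chosen strategy's instruction to the user's task."""
    strategy = pick_strategy(task)
    return f"{STRATEGY_PROMPTS[strategy]}\n\nTask: {task}"
```

In Orca 2 itself this selection is implicit in the fine-tuned weights; the sketch only shows why matching strategy to task matters, since a math proof benefits from step-by-step reasoning while a factual lookup does not.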

A few months ago, Redmond researchers launched its predecessor, Orca 1, with 13 billion parameters. You can read Microsoft's Orca 2 paper here.
