Microsoft, the world's largest technology company, and Mistral AI, a Paris-based generative artificial intelligence (AI) startup, have teamed up to bring Mistral AI's models to Microsoft's Azure cloud computing platform.
The company, less than a year old, is causing a stir and is regarded as Europe's answer to AI platforms such as Google's Gemini (formerly known as Bard) and OpenAI's ChatGPT.
As part of the agreement, Azure customers will get early access to Mistral's newest model, Mistral Large. A number of Mistral's AI models will be available to Azure users, and the company will host its technology on Microsoft's cloud computing network.
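For developers, access to a hosted model like this typically comes through a chat-completions style HTTP API. The snippet below is a minimal, hypothetical sketch of calling a Mistral Large deployment; the endpoint URL, environment variable names, and payload shape are assumptions (an OpenAI-style chat completions request), not values documented in the announcement.

```python
# Hypothetical sketch: calling a cloud-hosted Mistral Large deployment.
# The URL, headers, and payload shape are assumptions, not documented values;
# substitute the endpoint and key issued for your own deployment.
import os
import requests

ENDPOINT = os.environ["MISTRAL_AZURE_ENDPOINT"]  # placeholder, e.g. your deployment's base URL
API_KEY = os.environ["MISTRAL_AZURE_API_KEY"]    # placeholder secret

payload = {
    "messages": [
        {"role": "user", "content": "Summarize the Microsoft-Mistral partnership in one sentence."}
    ],
    "max_tokens": 128,
}

resp = requests.post(
    f"{ENDPOINT}/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
    json=payload,
    timeout=30,
)
resp.raise_for_status()
# Assumes an OpenAI-style response body with a "choices" list.
print(resp.json()["choices"][0]["message"]["content"])
```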
According to a Reuters report, Microsoft plans to take a minority stake in Mistral AI. It is worth noting that Microsoft has already invested more than $10 billion in Mistral's rival, OpenAI.
"Under the terms of the agreement, Mistral AI will further test, develop, and scale up its LLMs (large language models) while taking advantage of Google Cloud's security and privacy standards. Mistral AI will use Google Cloud's AI-optimized infrastructure, including TPU Accelerators."
In order to disseminate its most recent model, the Mistral Large, Mistral has also been collaborating with Amazon and other cloud platforms.
In a post on X in December, Mistral quietly unveiled its Mixtral 8x7B model, claiming it outperforms Meta's Llama 2 and OpenAI's GPT-3.5 large language models (LLMs) on several performance metrics.
In a blog post, Mistral AI described it as the strongest open-weight model with a permissive license and the best model overall on cost/performance trade-offs, matching or surpassing GPT-3.5 on most common benchmarks.
The firm, called "Mistral 7B," debuted its first big language model in September and has since become one of the fastest-growing unicorns in Europe.
The Mixtral 8x7B model not only beat many benchmarks on performance metrics; the company also says it relies on a far more economical technique, a sparse mixture-of-experts architecture, that makes the model much less computationally demanding than its total parameter count would suggest.
Despite having 46.7 billion parameters in total, the model uses only 12.9 billion parameters per token. As a result, according to the company, "it processes input and generates output at the same speed and for the same cost as a 12.9B model."
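To make the "fraction of parameters per token" idea concrete, here is a minimal sketch of sparse mixture-of-experts routing with top-2 expert selection, mirroring the publicly described Mixtral design at a toy scale. This is illustrative only, not Mistral's implementation; the expert count, hidden size, and function names are made up for clarity.

```python
# Illustrative sketch of sparse mixture-of-experts (MoE) routing, not Mistral's code.
# Assumes 8 experts with top-2 routing per token, as publicly described for Mixtral.
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 8   # total experts in the layer
TOP_K = 2         # experts actually executed per token
D_MODEL = 16      # toy hidden size

# Each "expert" is reduced to a single weight matrix in this sketch.
experts = [rng.standard_normal((D_MODEL, D_MODEL)) for _ in range(NUM_EXPERTS)]
router_w = rng.standard_normal((D_MODEL, NUM_EXPERTS))

def moe_layer(token: np.ndarray) -> np.ndarray:
    """Route one token to its top-k experts and mix their outputs."""
    logits = token @ router_w                    # router score per expert, shape (NUM_EXPERTS,)
    top_idx = np.argsort(logits)[-TOP_K:]        # indices of the k best-scoring experts
    weights = np.exp(logits[top_idx])
    weights /= weights.sum()                     # softmax over the selected experts only
    # Only TOP_K of the NUM_EXPERTS weight matrices are touched for this token,
    # which is why the active parameter count per token is a fraction of the total.
    return sum(w * (token @ experts[i]) for w, i in zip(weights, top_idx))

out = moe_layer(rng.standard_normal(D_MODEL))
print(out.shape)  # (16,)
```

In a full model the unused experts still have to be stored in memory, but per-token compute scales with the experts actually selected, which is the trade-off behind the "12.9B-model speed" claim above.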
For businesses already paying steep infrastructure bills as they scale their AI workloads, this can drastically lower deployment costs.
In its second fundraising round in seven months, Mistral raised $415 million in December, in a round led by investors including Andreessen Horowitz and Lightspeed Venture Partners. In addition, Salesforce and Nvidia committed around $130 million in convertible debt to the financing.
The French startup is counted among the world's top AI startups, and although the company did not disclose its valuation, a Reuters report estimated it at around €2 billion.
Salesforce, BNP Paribas, Eric Schmidt, La Famiglia, Motier Ventures, Sofina, and New Wave are a few of its backers.
The French firm is led by CEO Arthur Mensch together with co-founders Guillaume Lample and Timothée Lacroix, who previously worked at Meta and Google's DeepMind.