Snowflake Partners with Mistral AI to Bring Language Models to Enterprises Through Snowflake Cortex
Snowflake, the Data Cloud company, and Mistral AI, a European provider of AI solutions, have announced a global partnership. The collaboration aims to bring Mistral AI’s most powerful language models directly to Snowflake customers in the Data Cloud.
The partnership, spanning multiple years and including a parallel investment in Mistral’s Series A from Snowflake Ventures, seeks to equip enterprises with the capabilities necessary to harness the potential of large language models (LLMs) while upholding security, privacy, and governance over their data assets.
With the new partnership, Snowflake customers will gain access to Mistral AI’s latest and most robust LLM, Mistral Large, which boasts benchmarks placing it among the world's top-performing models. Beyond its benchmark performance, Mistral AI’s flagship model exhibits unique reasoning capabilities, proficiency in code and mathematics, and fluency in five languages: French, English, German, Spanish, and Italian. This aligns with Mistral AI's commitment to promoting the cultural and linguistic specificities of generative AI technology. Furthermore, the model can process hundreds of pages of documents in a single call.
In addition to Mistral Large, Snowflake customers will also have access to Mixtral 8x7B, Mistral AI’s open-source model that surpasses OpenAI’s GPT-3.5 in both speed and quality on most benchmarks. Alongside this, customers can utilise Mistral 7B, Mistral AI’s foundational model optimised for low latency, with low memory requirements and high throughput for its size.
These models are now available to customers in public preview as part of Snowflake Cortex, Snowflake's fully managed LLM and vector search service. This service empowers organisations to accelerate analytics and rapidly develop AI applications securely utilising their enterprise data.
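As an illustration of how these models surface in Cortex, the minimal SQL sketch below shows how a customer might prompt Mistral Large from a worksheet. The completion function and the model identifier ('mistral-large') reflect Snowflake Cortex naming in public preview and are assumptions for illustration rather than part of the announcement itself.

    -- Illustrative sketch: prompt Mistral Large via a Snowflake Cortex LLM Function.
    -- 'mistral-large' is the assumed model identifier exposed in public preview.
    SELECT SNOWFLAKE.CORTEX.COMPLETE(
        'mistral-large',
        'Summarise the key differences between a data warehouse and a data lake.'
    ) AS response;

Because inference runs inside Snowflake Cortex, the prompt and the generated response stay within the customer’s existing security and governance controls.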
“By partnering with Mistral AI, Snowflake is putting one of the most powerful LLMs on the market directly in the hands of our customers, empowering every user to build cutting-edge, AI-powered apps with simplicity and scale,” said Sridhar Ramaswamy, CEO of Snowflake. “With Snowflake as the trusted data foundation, we’re transforming how enterprises harness the power of LLMs through Snowflake Cortex so they can cost-effectively address new AI use cases within the security and privacy boundaries of the Data Cloud.”
“Snowflake’s commitments to security, privacy, and governance align with Mistral AI’s ambition to put frontier AI in everyone’s hand and to be accessible everywhere. Mistral AI shares Snowflake’s values for developing efficient, helpful, and trustworthy AI models that advance how organisations around the world tap into generative AI,” said Arthur Mensch, CEO and co-founder of Mistral AI. “With our models available in the Snowflake Data Cloud, we are able to further democratise AI so users can create more sophisticated AI apps that drive value at a global scale.”
At Snowday 2023, Snowflake first announced Snowflake Cortex support for industry-leading LLMs for specialised tasks such as sentiment analysis, translation, and summarisation, alongside foundation LLMs – starting with Meta AI’s Llama 2 model – for use cases including retrieval-augmented generation (RAG). Snowflake is continuing to invest in its generative AI efforts by partnering with Mistral AI and expanding the suite of foundation LLMs in Snowflake Cortex, giving organisations an easy path to bring state-of-the-art generative AI to every part of their business. To deliver a serverless experience that makes AI accessible to a broad set of users, Snowflake Cortex eliminates lengthy GPU procurement and complex infrastructure management; to do so, Snowflake partners with NVIDIA on a full-stack accelerated computing platform that leverages the NVIDIA Triton Inference Server, among other tools.
With Snowflake Cortex LLM Functions now in public preview, Snowflake users can apply AI to their enterprise data across a wide range of use cases. Using specialised functions, any user with SQL skills can leverage smaller LLMs to cost-effectively address specific tasks such as sentiment analysis, translation, and summarisation in seconds. For more complex use cases, Python developers can go from concept to full-stack AI apps such as chatbots in minutes by combining foundation LLMs – including Mistral AI’s models in Snowflake Cortex – with chat elements (in public preview soon) within Streamlit in Snowflake. The same streamlined experience extends to RAG through Snowflake’s integrated vector functions and vector data types (both in public preview soon), all while ensuring the data never leaves Snowflake’s security and governance perimeter.
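To make the SQL path concrete, here is a minimal sketch of the kind of query the specialised Cortex functions enable. The customer_reviews table and its columns are hypothetical, and the function names (SNOWFLAKE.CORTEX.SENTIMENT, TRANSLATE, SUMMARIZE) are assumed from the Cortex LLM Functions available in public preview; treat the details as indicative.

    -- Illustrative sketch: score, translate, and summarise reviews with task-specific Cortex functions.
    -- The customer_reviews table is hypothetical, not part of the announcement.
    SELECT
        review_id,
        SNOWFLAKE.CORTEX.SENTIMENT(review_text)              AS sentiment_score,    -- roughly -1 (negative) to 1 (positive)
        SNOWFLAKE.CORTEX.TRANSLATE(review_text, 'de', 'en')  AS review_in_english,  -- German to English
        SNOWFLAKE.CORTEX.SUMMARIZE(review_text)              AS review_summary
    FROM customer_reviews
    LIMIT 100;

Because these are ordinary SQL functions, they can be composed with joins, filters, and existing governance policies like any other query over enterprise data.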
Snowflake is committed to furthering AI innovation not just for its customers and the Data Cloud ecosystem, but also for the wider technology community. As a result, Snowflake recently joined the AI Alliance, an international community of developers, researchers, and organisations dedicated to promoting open, safe, and responsible AI. Through the AI Alliance, Snowflake will continue to comprehensively and openly address both the challenges and opportunities of generative AI in order to further democratise its benefits.