
Microsoft CEO Satya Nadella Joins Meta's Threads; Here's What He Wrote In His First Post

Meta and Microsoft announced support for the Llama 2 family of large language models (LLMs) on Azure and Windows.

Source: Mark Zuckerberg's Instagram

Microsoft Chairman and CEO Satya Nadella on Tuesday joined Meta's new app Threads as the two companies announced an expansion of their AI partnership, bringing Llama 2 to Azure and Windows.

"What a great day to join Threads!" Nadella wrote as he shared the news of both Meta and Microsoft coming together.

"Today, at Microsoft Inspire, Meta and Microsoft announced support for the Llama 2 family of large language models (LLMs) on Azure and Windows. Llama 2 is designed to enable developers and organizations to build generative AI-powered tools and experiences," said a Microsoft blog post shared by Nadella.


Source: Satya Nadella's Threads Account

"Grateful to have both you and Microsoft as a partner to get tools like Llama 2 in more people's hands," Zuckerberg replied.

Zuckerberg also posted a photo of himself along with Nadella on his Instagram account.


"Meta and Microsoft share a commitment to democratizing AI and its benefits and we are excited that Meta is taking an open approach with Llama 2," the blog post said.

"We offer developers choice in the types of models they build on, supporting open and frontier models and are thrilled to be Meta’s preferred partner as they release their new version of Llama 2 to commercial customers for the first time," it added.

Azure customers can now fine-tune and deploy the 7B, 13B, and 70B-parameter Llama 2 models easily and more safely on Azure, the platform for the most widely adopted frontier and open models. In addition, Llama 2 will be optimized to run locally on Windows. Windows developers will be able to use Llama 2 by targeting the DirectML execution provider through the ONNX Runtime, allowing a seamless workflow as they bring generative AI experiences to their applications.
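To illustrate what targeting the DirectML execution provider through the ONNX Runtime looks like in practice, here is a minimal sketch in Python. It assumes an already-exported ONNX model file and Windows with the onnxruntime-directml package installed; the file name, input shape, and tensor contents are hypothetical placeholders, not details from the article or the Microsoft blog post.

```python
# Minimal sketch: running an ONNX model on the DirectML execution provider
# via ONNX Runtime on Windows. Requires the onnxruntime-directml package.
import numpy as np
import onnxruntime as ort

# Hypothetical exported model file; a real Llama 2 export would be much larger
# and expect tokenized input IDs produced by the model's tokenizer.
session = ort.InferenceSession(
    "llama2-7b.onnx",
    providers=["DmlExecutionProvider", "CPUExecutionProvider"],
)

# Build a dummy batch of token IDs just to exercise the session.
input_name = session.get_inputs()[0].name
dummy_input = np.zeros((1, 8), dtype=np.int64)

outputs = session.run(None, {input_name: dummy_input})
print([o.shape for o in outputs])
```

Listing "CPUExecutionProvider" after "DmlExecutionProvider" simply gives ONNX Runtime a fallback if DirectML is unavailable on the machine.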

Read the full blog post here.
