LlamaCon: Meta's open-source love letter to developers

Anabelle Nicoud, Staff Writer, IBM

On Tuesday, Meta hosted LlamaCon, the first-ever conference dedicated to its family of AI models, Llama. It also announced the upcoming launch of the Llama API, a customizable, no-lock-in way to access its models that is currently available only in preview.

“You can now start using Llama with just one line of code,” said Chris Cox, Meta’s Chief Product Officer, on stage. The new Llama API comes with one-click key creation and interactive playgrounds, so developers can easily test different models, including Llama 4 Scout and Llama 4 Maverick. It’s fully customizable, with no lock-in. And it’s compatible with the OpenAI SDK, making it easy to plug into existing apps.
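Because the Llama API is compatible with the OpenAI SDK, an existing chat-completions client can, in principle, be repointed at it simply by swapping the API key and base URL. The snippet below is a minimal sketch of that idea; the base URL and model identifier shown are illustrative placeholders, not confirmed values from the announcement.

# Minimal sketch: calling an OpenAI-SDK-compatible endpoint such as the Llama API.
# The base_url and model name below are illustrative placeholders, not confirmed values.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_LLAMA_API_KEY",              # key created via one-click key creation
    base_url="https://api.llama.example/v1",   # placeholder for the OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="llama-4-scout",                     # illustrative model identifier
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the key announcements from LlamaCon."},
    ],
)

print(response.choices[0].message.content)

Because only the key and base URL change, an app already built against the OpenAI SDK would need little more than a configuration update to try a Llama model.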

A more accessible, developer-friendly API

With this announcement, Meta makes it clear that it cares about the developer experience, IBM experts say. The goal, according to Llama VP Manohar Paluri and Llama Researcher Angela Fan, is to find the fastest and easiest way to build with Llama—offering speed, yes, but also flexibility. Developers can fully customize models, take them anywhere and never have to worry about lock-in. Meta handles the infrastructure and inference, so teams can stay focused on building.

“Meta is basically opening their model through APIs; it’s another way to make it public and more accessible,” says Jacob Ben-David, a Global Cloud and AI Business Director at Red Hat, in an interview with IBM Think. “Before this, you either had to run it locally and figure out how to interact with it, or use a hosting provider, like Groq. I’ve been using some of the Llama models via API by connecting through GroqCloud, which was hosting a limited version of Llama. Now, they’re opening it up completely, and I think that gives developers much more accessibility.”

Also newly released is a Llama Stack integration with IBM, with Red Hat serving as a partner on the integration, making it easier to deploy Llama for enterprise use.

“I’m so happy to see them build a real developer ecosystem around Llama,” wrote Armand Ruiz, VP of AI Platform at IBM, on LinkedIn. “In today’s AI market, it’s not enough to just drop models on Hugging Face. To create real gravity, MOAT and stickiness, you need platforms, tools and community, and that’s exactly what Meta is doing.”

Finally, Meta CEO Mark Zuckerberg discussed a lightweight version of Llama—putting the focus on distillation in his interviews with Databricks CEO Ali Ghodsi and Microsoft CEO Satya Nadella.

“That connects well with IBM’s small, domain-specific model journey as well,” notes Shobhit Varshney, a Senior Partner and VP at IBM Consulting, in an interview with IBM Think.
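For readers unfamiliar with the technique, distillation trains a smaller "student" model to imitate the output distribution of a larger "teacher" model. The sketch below shows the standard soft-label distillation loss in PyTorch; it is a generic illustration of the concept, not Meta's training recipe.

# Generic sketch of knowledge distillation (not Meta's actual recipe):
# the student is trained to match the teacher's softened output distribution.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # Soften both distributions with a temperature, then minimize their KL divergence.
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
    # Scale by T^2 to keep gradient magnitudes comparable across temperatures.
    return F.kl_div(student_log_probs, teacher_probs, reduction="batchmean") * temperature**2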

Meta also announced that models from the Llama family have been downloaded 1.2 billion times. The conference as a whole served as a fresh reminder of Meta’s commitment to the open-source community, two years after the release of the first Llama model.

With Zuckerberg interviewing heavyweights in the industry, the conference felt like a “coming out party,” says Varshney. “Meta has a massive developer community now. So today was their first big developer conference, a coming out moment where they’re saying, ‘We’re here to help developers with the tools they need, so you don’t have to rely on others.’”

And last but not least, Meta announced the launch of a standalone Meta AI app. “There are already almost a billion monthly active users engaging with Meta AI across our different apps—and a lot of people using it daily,” said Zuckerberg. “But we figured some would enjoy having it as a standalone experience.”

Could Meta AI leverage the large user base the company has across its most popular social apps to compete with ChatGPT? Vyoma Gajjar, an AI Technical Solution Architect at IBM, thinks anything is possible.

“If Meta AI does it well, then maybe it can become the new ChatGPT. But it depends: everything comes down to execution,” she says.
