
Meta's New AI Models Aren't Llamas, but They Are Used in Wildlife Conservation Research


Meta just released the third generation of its SAM series, which stands for "segment anything models." These AI models are focused on visual intelligence, and they will power improvements to how you edit content on Meta's platforms.

These aren't large language models that power chatbots, and they aren't part of Meta's Llama family, though Llama models were used in their creation. Instead, the SAM models specialize in detecting and segmenting objects in images and video. The companion SAM 3D models focus on specific subjects, like the human body. This kind of visual intelligence is one way AI models are being built to better understand the physical world.

Meta trained SAM 3 on a huge dataset that pairs images and videos with text descriptions. Click on one elephant in a photo, and SAM 3 can analyze the image and highlight every elephant in it. You can do the same with text, asking the model to highlight "red caps" or "everyone sitting down." Handling that level of specificity is what the new model is designed to be good at.
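The article doesn't show SAM 3's programming interface, but the prompting idea it describes carries over from Meta's original, publicly documented segment_anything package (SAM 1). The sketch below uses that older API as a stand-in: a foreground click segments the single object under the cursor, whereas SAM 3's new concept prompts (a text phrase like "elephant") are what return every matching instance. The checkpoint filename comes from Meta's SAM 1 repo; the image path and click coordinates are placeholders.

```python
import numpy as np
from PIL import Image
from segment_anything import sam_model_registry, SamPredictor

# Load the published SAM 1 weights (checkpoint filename from Meta's repo);
# SAM 3's own open weights are distributed separately.
sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
predictor = SamPredictor(sam)

# The predictor expects an HxWx3 uint8 RGB array.
image = np.array(Image.open("elephants.jpg").convert("RGB"))
predictor.set_image(image)

# A single foreground click (label=1) at hypothetical pixel coordinates.
# SAM 1 segments only the one object under the click; SAM 3's concept
# prompts extend this to every instance matching a text description.
masks, scores, _ = predictor.predict(
    point_coords=np.array([[640, 360]]),
    point_labels=np.array([1]),
    multimask_output=True,  # three candidate masks at different granularities
)
best_mask = masks[np.argmax(scores)]  # boolean HxW array for the top-scoring mask
```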

Here's an example of how the new AI model can identify specific objects in an image, in this case, people. Meta/Screenshot by Katelyn Chedraoui

These aren't image or video generation models, so if you're not a developer, you probably don't need to see or use them. (If you're interested, you can try the open-weights models on Meta's new Segment Anything Playground.) But you should start to see improvements in content editing tools across Meta's platforms soon, thanks to SAM 3.

The company is using the new models to power more precise editing in its Instagram video editing app, Edits, and in Vibes, its AI video app. You can select multiple objects and apply edits to them in a batch. The Facebook Marketplace "view in room" feature will use SAM 3 to show you how items people are selling, like furniture, would look in your home.

One real-world example of how these models can be used is wildlife conservation research. Meta partnered with two wildlife monitoring organizations, Conservation X Labs and Osa Conservation, to build a database of "research-ready, raw video footage" from over 10,000 cameras, capturing over 100 species, according to the press release. SAM 3 helped analyze the footage and better identify the animals.
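Meta hasn't detailed that analysis pipeline. As a rough sketch of how promptable video segmentation works on footage like this, the snippet below uses the video-predictor API from Meta's earlier published sam2 repo: prompt an animal once on the first frame, then propagate its mask through the clip. The config and checkpoint names follow that repo, the frames directory and click coordinates are placeholders, and SAM 3's actual interface may differ.

```python
import numpy as np
import torch
from sam2.build_sam import build_sam2_video_predictor

# Config and checkpoint names as published in Meta's sam2 repo;
# "./camera_trap_frames" is a placeholder directory of extracted JPEG frames.
predictor = build_sam2_video_predictor(
    "configs/sam2.1/sam2.1_hiera_l.yaml", "sam2.1_hiera_large.pt"
)

with torch.inference_mode():
    state = predictor.init_state(video_path="./camera_trap_frames")

    # Prompt the animal once, on the first frame, with a single
    # foreground click (label=1) at hypothetical coordinates.
    predictor.add_new_points_or_box(
        inference_state=state,
        frame_idx=0,
        obj_id=1,
        points=np.array([[420, 260]], dtype=np.float32),
        labels=np.array([1], dtype=np.int32),
    )

    # Track that mask through the rest of the clip, frame by frame.
    for frame_idx, obj_ids, mask_logits in predictor.propagate_in_video(state):
        masks = (mask_logits > 0.0).cpu().numpy()  # boolean masks per tracked object
```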


These models come from Meta Superintelligence Labs. Meta's AI ambitions fueled a billion-dollar effort to poach top AI leaders and researchers from rival companies earlier this summer. But Meta's AI teams have faced significant challenges, and the company laid off 600 workers from its AI unit in late October. Recent reporting from the Financial Times indicates that Meta's chief AI scientist and AI pioneer Yann LeCun plans to leave the company to start his own venture.

For more, check out what to know about Meta's Llama 4 models and how to mute Meta AI.