
OpenAI's New Models Aren't Really Open: What to Know About Open-Weights AI


Despite the company's name, OpenAI hadn't released an open version of its AI models since GPT-2 in 2019. That changed on Tuesday, when CEO Sam Altman announced two new open-weights reasoning AI models: gpt-oss-120b (120 billion parameters) and gpt-oss-20b (20 billion parameters).

If open-weights is a new piece of AI jargon to you, don't worry. In the simplest terms, open-weights models are a category of AI models that power products like chatbots and image and video generators. But they're philosophically different from the technology underpinning the AI tools you might use now. ChatGPT, Gemini and Copilot all run on closed models, which means we have no real insight into how those black-box machines work. Open-weights models give us a peek at the mechanical wizard behind the curtain, so to speak.

You don't need to be a developer or machine learning expert to understand how these open models work or even to run them yourself. Here's everything you need to know about open-weights and open-source AI models.

What is an open-weights AI model?

All AI models have weights: the numerical parameters a model learns during training. Training adjusts these numbers so that certain connections carry more weight, or value, than others.
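
To make that concrete, here's a toy illustration in Python -- not how any production model is built, just the core idea: a single artificial neuron multiplies each input by its weight, so connections with bigger weights have more influence on the output.

```python
# A single artificial "neuron," stripped to the core idea: every input
# connection has a weight, and training nudges these numbers so that
# useful connections count for more.
inputs  = [0.5, 0.8, 0.1]   # signals coming into the neuron
weights = [0.9, 0.2, 0.4]   # learned values; bigger = more influence

output = sum(x * w for x, w in zip(inputs, weights))
print(output)  # ~0.65 -- the first input dominates because of its weight
```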

An open-weights model does exactly what its name implies -- the weights are publicly available, as defined by the Federal Trade Commission. Developers can see these weights and how they're used in the creation of AI models.
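"Publicly available" here means the weights are published as downloadable tensor files anyone can inspect. Below is a hedged Python sketch using the Hugging Face hub; the repo ID and shard filename are assumptions for illustration, so check the actual model page for the real file names.

```python
from huggingface_hub import hf_hub_download
from safetensors import safe_open

# Download one weight shard and list a few of the tensors inside it.
# NOTE: the repo ID and filename are assumptions for illustration; the
# real shard names depend on how the repository is laid out.
path = hf_hub_download("openai/gpt-oss-20b", "model-00001-of-00003.safetensors")

with safe_open(path, framework="pt") as f:
    for name in list(f.keys())[:5]:  # peek at the first few weight tensors
        print(name, tuple(f.get_tensor(name).shape))
```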

"Arguably, the most valuable thing in large [language] models is actually the weights. You can do a lot if you have the weights, which is somewhat different from traditional software," Omar Khattab, assistant professor of computer science at MIT and researcher in its computer science and artificial intelligence lab (CSAIL), told CNET.

For example, a chatbot is built to be really good at predicting the next logical word in a sentence. It's trained to string together words that frequently appear next to each other in its training data, presumably in a logical order. Word pairs that show up together more often can be given more weight than pairs that rarely appear together.
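
As a deliberately simplified toy -- real chatbots use neural networks, not word counts -- here's that "more weight to frequent neighbors" idea in a few lines of Python:

```python
from collections import Counter, defaultdict

# Count how often each word follows another in some "training text,"
# then predict the next word by picking the highest-weighted follower.
text = "the cat sat on the mat and the cat chased the dog".split()

follower_counts = defaultdict(Counter)
for current_word, next_word in zip(text, text[1:]):
    follower_counts[current_word][next_word] += 1

def predict_next(word):
    # The most frequent follower carries the most "weight."
    return follower_counts[word].most_common(1)[0][0]

print(predict_next("the"))  # -> "cat", which follows "the" most often
```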

These weights are just numbers, but open-weights models also come with a map.

"In open-weights [models], you get the weights, which are these numbers, and you get how to map those weights into the neural network structure, so the layers of the neural network, in order to actually be able to run it," said Khattab. The architecture of the model shows how a company structures its models, which is "incredibly valuable."
