OpenAI returns to its open-source roots with new open-weight AI models, and it's a big deal
Briefly

"OpenAI has unveiled two new open-weight language models, gpt-oss-120b and gpt-oss-20b, marking its first public release of freely available AI model weights since GPT-2 in 2019."
"Open-weight models enable anyone to download, examine, run, or fine-tune the LLM, eliminating the need to rely on remote cloud APIs or expose in-house sensitive data."
OpenAI announced the release of two open-weight language models, gpt-oss-120b and gpt-oss-20b, its first public release of model weights since GPT-2 in 2019. Anyone can download, inspect, run, and fine-tune the models locally, avoiding reliance on cloud APIs and keeping sensitive data in-house. The release marks a significant shift from OpenAI's recent focus on proprietary models built around corporate partnerships. The larger gpt-oss-120b targets high-performance servers, while gpt-oss-20b is small enough to run on a laptop, provided the machine meets the model's hardware requirements. OpenAI did not release the training data, citing regulatory and safety concerns.
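For readers curious what "running an open-weight model locally" looks like in practice, here is a minimal sketch using the Hugging Face transformers library. The repository id "openai/gpt-oss-20b" and the assumption that your machine has enough memory for the 20b model are not confirmed by this summary; check the official model card before running.

```python
# Minimal sketch: loading and prompting gpt-oss-20b locally with Hugging Face transformers.
# Assumptions: the weights are published under the repo id "openai/gpt-oss-20b"
# and the local machine has enough GPU/CPU memory to hold the model.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="openai/gpt-oss-20b",  # assumed Hugging Face repo id
    torch_dtype="auto",          # let transformers pick a supported dtype
    device_map="auto",           # place weights on available GPU/CPU memory
)

output = generator(
    "Explain in one sentence what an open-weight model is.",
    max_new_tokens=64,
)
print(output[0]["generated_text"])
```

Because the weights are local, the prompt and the generated text never leave the machine, which is the data-security point the summary makes about avoiding remote cloud APIs.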
Read at ZDNET