OpenAI has re-entered the open-source AI community with the release of two new models, gpt-oss-120b and gpt-oss-20b, its first open-weight language models since GPT-2 in 2019. Both are built on a Mixture-of-Experts (MoE) architecture, in which a router activates only a small subset of expert sub-networks for each token, and are designed to offer performance comparable to OpenAI's proprietary o4-mini and o3-mini models, respectively.
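OpenAI has not published its expert configuration alongside this announcement, so the following is only a generic sketch of the top-k routing idea behind MoE layers; the TopKMoELayer name, the dimensions, and the expert count are all invented for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoELayer(nn.Module):
    """Illustrative Mixture-of-Experts layer: a router picks the top-k experts per token."""
    def __init__(self, d_model=64, d_hidden=256, n_experts=8, k=2):
        super().__init__()
        self.k = k
        self.router = nn.Linear(d_model, n_experts)  # scores every expert for each token
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(), nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):                            # x: (tokens, d_model)
        scores = self.router(x)                      # (tokens, n_experts)
        weights, idx = scores.topk(self.k, dim=-1)   # keep only the k best experts per token
        weights = F.softmax(weights, dim=-1)         # normalize over the chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e             # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

tokens = torch.randn(10, 64)
print(TopKMoELayer()(tokens).shape)  # torch.Size([10, 64])
```

Because each token passes through only k of the experts, an MoE model's total parameter count can grow far faster than its per-token compute, which is the main appeal of the approach for large models like gpt-oss-120b.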
The San Francisco-based AI firm emphasizes the rigorous safety training and evaluation the models have undergone, suggesting that OpenAI is prioritizing responsible AI development as it re-engages with the open-source movement.
The open weights of both gpt-oss models are now available for download on Hugging Face, letting researchers and developers run, fine-tune, and study them directly; a sketch of loading the smaller model follows below. The move signals a willingness from OpenAI to share its technology with the broader AI ecosystem and could foster further innovation and collaboration in the field.
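As a rough illustration of how such open weights are typically pulled down, here is a minimal sketch using the Hugging Face transformers pipeline API. The repo id openai/gpt-oss-20b matches the naming used at release, but treat it, and the hardware assumptions, as unverified before running.

```python
# A minimal sketch, assuming the weights are published as "openai/gpt-oss-20b"
# on Hugging Face and that transformers (plus accelerate for device_map) is installed.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="openai/gpt-oss-20b",   # assumed repo id; swap in gpt-oss-120b for the larger model
    torch_dtype="auto",           # let transformers pick an appropriate precision
    device_map="auto",            # spread the weights across available devices
)

output = generator("Explain Mixture-of-Experts in one sentence.", max_new_tokens=128)
print(output[0]["generated_text"])
```

Running the 120B variant locally will require substantially more accelerator memory than the 20B model; the pipeline call itself is identical apart from the repo id.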