
AI Wrapped: The 14 AI terms you couldn’t avoid in 2025


The key to R1’s success was distillation, a technique that makes AI models more efficient. It works by having a bigger model tutor a smaller one: you run the teacher model on many examples and record its answers, then train the student model to reproduce those responses as closely as possible, so that it ends up with a compressed version of the teacher’s knowledge. —Caiwei Chen
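The teacher-student loop described above can be sketched in a few lines. This is a minimal, hypothetical illustration (not R1's actual training code): the "teacher" is a fixed linear classifier, the "student" starts from zero weights, and the student is trained to match the teacher's temperature-softened output distribution, the standard soft-target form of distillation.

```python
import math
import random

def softmax(logits, temperature=1.0):
    # Higher temperature softens the distribution, exposing more of
    # the teacher's relative preferences between classes.
    exps = [math.exp(l / temperature) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical "teacher": a fixed linear model (2 features, 3 classes).
TEACHER_W = [[2.0, -1.0], [-1.5, 2.5], [0.5, 0.5]]

def teacher_logits(x):
    return [sum(w * xi for w, xi in zip(row, x)) for row in TEACHER_W]

# "Student": same shape here for simplicity, initialized to zeros.
student_W = [[0.0, 0.0] for _ in range(3)]

def student_logits(x):
    return [sum(w * xi for w, xi in zip(row, x)) for row in student_W]

def distill_step(x, temperature=2.0, lr=0.1):
    # Record the teacher's soft answer, then nudge the student toward it.
    target = softmax(teacher_logits(x), temperature)
    pred = softmax(student_logits(x), temperature)
    for k in range(3):
        # Gradient of cross-entropy w.r.t. the student's logit k.
        grad = pred[k] - target[k]
        for j in range(2):
            student_W[k][j] -= lr * grad * x[j]

random.seed(0)
data = [[random.uniform(-1, 1), random.uniform(-1, 1)] for _ in range(200)]
for _ in range(300):
    for x in data:
        distill_step(x)
```

After training, the student agrees with the teacher's top prediction on nearly all inputs, even though it never saw the teacher's weights, only its recorded answers. Real distillation works the same way at vastly larger scale, with the student usually much smaller than the teacher.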

10. Sycophancy

As people across the world spend increasing amounts of time interacting with chatbots like…
