Can a frozen transformer still learn new moves?
In the world of artificial intelligence, bigger has often meant better. Large language models keep swelling, soaking up more data, parameters, and compute until they feel almost like a force of nature. Yet there’s a growing ache behind the thrill: how do we keep cost from exploding alongside quality? A team of researchers led by…