A joint team from Google Research and DeepMind has developed a training method called SALT (Small model aided large model training) that cuts training time by up to 28 percent while improving performance. The key innovation? Using smaller language models as assistant teachers. The researchers also created an enhanced version called SALTDS that carefully selects training data, focusing on examples where the smaller model performs well.
Source: Google finds new way to train AI models using smaller ‘teacher’ models
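The mechanics described above can be sketched in a few lines. This is an illustrative, hypothetical reconstruction, not Google's implementation: it assumes SALT's teacher signal amounts to mixing standard cross-entropy with a KL-divergence term toward the small model's soft predictions, and that SALTDS's data selection keeps examples the small model answers confidently and correctly. All function names, the mixing weight `alpha`, and the confidence `threshold` are invented for the sketch.

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over the last axis."""
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def salt_loss(student_logits, teacher_logits, labels, alpha=0.5):
    """Hypothetical SALT-style objective: mix cross-entropy on the true
    labels with KL(teacher || student) toward the small model's outputs."""
    p_student = softmax(student_logits)
    p_teacher = softmax(teacher_logits)
    n = len(labels)
    ce = -np.log(p_student[np.arange(n), labels] + 1e-12).mean()
    kl = (p_teacher * (np.log(p_teacher + 1e-12)
                       - np.log(p_student + 1e-12))).sum(axis=-1).mean()
    return (1 - alpha) * ce + alpha * kl

def select_easy_examples(teacher_logits, labels, threshold=0.6):
    """Hypothetical SALTDS-style filter: keep indices where the small
    model is both correct and confident above `threshold`."""
    p = softmax(teacher_logits)
    conf = p[np.arange(len(labels)), labels]
    return np.where((p.argmax(axis=-1) == labels) & (conf >= threshold))[0]
```

In this toy form, the large model would train first on the filtered, distilled signal, then switch to standard training; the reported 28 percent speedup would come from the small model supplying cheap supervision early on.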

Back in May, OpenAI said it was developing a tool to let creators specify how they want their works to be included in, or excluded from, its AI training data. But seven months later, the feature has yet to see the light of day. Called Media Manager, the tool would "identify copyrighted text, images, audio, and video" across multiple sources to reflect creators' preferences. It was intended to stave off some of the company's fiercest critics, and potentially shield OpenAI from IP-related legal challenges.