KINGFISHER INVESTMENT RESEARCH


Wednesday, January 29, 2025

Deepseek and Distillation

 


Asked whether China's DeepSeek stole American IP, AI czar David Sacks said it appears a technique called distillation was used, in which a student model can "suck the knowledge" out of a parent model. He added that there is evidence DeepSeek distilled knowledge from OpenAI's models, which is prompting efforts to develop techniques to prevent future copycats.


Distillation is a common practice in the AI industry, but the concern is that DeepSeek may have used it to build a rival model, which would breach OpenAI's terms of service. Here's why: ft.trib.al/NydPc2b
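To make the idea concrete, here is a minimal sketch of the core of classic (Hinton-style) knowledge distillation: the student model is trained to match the teacher's full output distribution ("soft targets"), typically via a temperature-scaled KL-divergence loss. This is a generic illustration of the technique, not DeepSeek's or OpenAI's actual training code; the function names and temperature value are illustrative.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T "softens" the distribution,
    # exposing more of the teacher's relative preferences.
    z = np.asarray(logits, dtype=float) / T
    z -= z.max(axis=-1, keepdims=True)  # for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence between the teacher's soft targets and the
    # student's predictions: the student is penalized for diverging
    # from the teacher's whole distribution, not just its top answer.
    p = softmax(teacher_logits, T)  # teacher's soft targets
    q = softmax(student_logits, T)  # student's predictions
    return float(np.sum(p * (np.log(p) - np.log(q))))

# A student whose logits match the teacher's incurs (near) zero loss.
teacher = [2.0, 1.0, 0.1]
print(distillation_loss(teacher, teacher))          # ~0.0
# A mismatched student incurs positive loss, driving it to imitate.
print(distillation_loss([0.1, 1.0, 2.0], teacher))  # > 0
```

The controversy is not about this technique itself, which is standard and well documented, but about applying it to a rival's proprietary model outputs in violation of its terms of service.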



