• Lemminary@lemmy.world
    10 months ago

They’re not storing the original data, and OpenAI even states as much themselves. LLMs build up derived associations between words and concepts from whatever they analyze, and those associations are further modified by every other source analyzed; that blended result is what gets stored during training. It doesn’t matter if it’s a few sources or a million, none of it is stored as-is. It’s much like how we process information ourselves over our entire lives by making generalizations: we don’t memorize everything precisely beyond the foundational blocks of language, but our neurons do fire in a certain pattern when given a trigger. How is that stealing?
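
A rough toy analogy for the point above (this is deliberately simplified and not how an actual LLM trains; the example sentences are made up): if you aggregate word co-occurrence counts across several sources, what you end up storing is one blended statistic per word pair, and the original sentences can’t be read back out of it verbatim.

```python
from collections import Counter
from itertools import combinations

# Hypothetical "sources" for illustration only.
sources = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "a cat and a dog played",
]

# Aggregate pairwise co-occurrence counts across all sources.
# Each new source modifies the same shared statistics.
cooccur = Counter()
for text in sources:
    words = text.split()
    for a, b in combinations(words, 2):
        cooccur[tuple(sorted((a, b)))] += 1

# The stored representation is blended counts, not any source text as-is.
print(cooccur[("cat", "the")])                     # a number, not a sentence
print("the cat sat on the mat" in str(cooccur))    # the original text isn't in there
```

Real model weights are far more opaque than counts, which only strengthens the point: what training stores is derived structure, not copies.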