• wander1236@sh.itjust.works
    7 months ago

    The work has to be stored in some form for the AI to "learn" from and remember it, and much of the debate is over whether AI can actually learn, or whether it can only blindly recombine 1:1 copies of elements into something derivative.

    There’s also the debate over whether what humans learn and produce under influence can be compared to AI at all, though humans can’t consume millions of records in seconds the way AI can.

    • Lemminary@lemmy.world
      7 months ago

      They’re not storing the original data, and OpenAI themselves say as much. An LLM builds up derived associations between words and concepts from whatever it analyzes, each association further modified by every other source it trains on, and that’s what gets stored during training. Whether it’s a few sources or a million, none of them is stored as-is. It’s much like how we process information over our entire lives by generalizing: beyond the foundational blocks of language, we don’t memorize everything precisely, but our neurons do fire in a certain pattern when given a trigger. How is that stealing?
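
      The point about "derived associations" can be sketched in miniature. This toy example (my own illustration, not how any real LLM actually works) aggregates word co-occurrence counts across a corpus; once counts from many sources are summed into the table, individual sentences can't be read back out, only association strengths remain:

      ```python
      # Toy sketch: store aggregated word associations, not the text itself.
      # (Illustrative assumption only - real LLMs learn dense weights via
      # gradient descent, but the "no verbatim copy stored" idea is the same.)
      from collections import defaultdict
      from itertools import combinations

      def train(sentences):
          assoc = defaultdict(int)
          for s in sentences:
              words = set(s.lower().split())
              for pair in combinations(words, 2):
                  assoc[tuple(sorted(pair))] += 1  # blend into shared counts
          return assoc

      corpus = [
          "the cat sat on the mat",
          "the dog sat on the rug",
          "a cat chased the dog",
      ]
      model = train(corpus)

      # The "model" holds only pair strengths, e.g. how often "cat" and
      # "sat" appeared together; no original sentence is stored anywhere.
      print(model[("cat", "sat")])  # → 1
      ```

      After training, the table only answers questions like "how strongly are these two words linked?", which is closer to a learned generalization than to a 1:1 copy.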