Deputy prime minister to urge UN general assembly to create international regulatory system

      • randon31415@lemmy.world
        10 months ago

        Turkey famously required a 3-day review before you could post a reply on an internet forum back in the '90s. They thought forums were like dueling letters to the editor.

      • Gsus4@feddit.nl
        10 months ago

        But are they all dinosaur paper-based lawyers? Or just so focused on specific issues that they miss the bigger picture? Or are they actually advised by expert panels, but meetings take too long… or nobody cares… or everyone is compromised by other interests… what?

  • ShittyBeatlesFCPres@lemmy.world
    10 months ago

    Honestly, an easy way to regulate generative A.I. is to just pretend the output was made by a person. If your “A.I.” is used to create a deepfake political ad, you should be fined or sued as if you had an intern make it. If you aren’t sure the LLM won’t hallucinate falsehoods, don’t use it for news articles unless you’re ok with libel laws being applied.

    • PupBiru@kbin.social
      10 months ago

      it’s kinda exactly the same as someone on the street handing you a bit of paper with a rumour on it and you publishing it without checking that it’s correct

  • hereisoblivion@lemmy.world
    10 months ago

    Computers, the Internet, and the whole of IT have been moving too fast for regulators to keep up since the 90s. They are slower than a tortoise walking through molasses with a blindfold on.

    But what can you expect when those who make regulations over IT still don’t know how to change the time on their VCR?

    • kromem@lemmy.world
      10 months ago

      That’s really not the differentiating factor.

      Easily 80% of young people loudly commenting on the topic online have no idea about nuances as centrally relevant to where the tech is going as “Do Large Language Models learn world models or just surface statistics?”.

      I see a ton of young people patting themselves on the back regurgitating what at this point is clear misinformation about stochastic parrots and remixing content that they picked up from similarly poorly informed tech writers with skin in the game, oblivious to the emerging picture in ongoing research.

      It is moving too quickly.

      Just today I was reading a paper on using CoT prompting (research from 2022) to efficiently transfer domain knowledge from a larger model to a much smaller model, which then outperforms the original.
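      The distillation idea mentioned above can be sketched roughly like this (a toy sketch only, not the paper's actual method; `teacher_generate` is a hypothetical stand-in for a call to a real teacher LLM):

```python
# Toy sketch of CoT-based distillation: a large "teacher" model annotates
# training questions with step-by-step rationales, and a smaller "student"
# model would then be fine-tuned on the (question, rationale) pairs so it
# learns the reasoning process, not just the final answers.

def teacher_generate(question: str) -> str:
    # Hypothetical stand-in: a real teacher LLM would return a
    # chain-of-thought rationale for the question.
    return f"(teacher's step-by-step rationale for: {question})"

def build_distillation_set(questions):
    # One training example per question, pairing it with the
    # teacher's rationale for student fine-tuning.
    return [{"question": q, "rationale": teacher_generate(q)} for q in questions]

dataset = build_distillation_set(["What is 2 + 2?", "Why is the sky blue?"])
```

      The fine-tuning step itself is omitted; the point is just that the teacher's rationales become the student's training data.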

      What that’s going to mean - for Meta’s open-sourced models, for the market for synthetic data, for the practical limitations on the impact of IP cases - is wild.

      And that’s just this week’s news.

      It’s way too much too quickly.

      Keep in mind that the average practicing doctor is 17 years out of touch with the most recent research.

      To expect a politician of any age to have a solid grasp on this stuff isn’t practical.

      There are a number of trends in the research that can be reasonably predicted, but I’ve never seen a field moving this fast.

      The very idea of trying to predict the situation even five years out is ludicrous. By the time legislation proposed today is being passed, it’s going to be obsolete.

      Regulators are screwed.