• Yote.zip@pawb.social · ↑43 ↓6 · 11 months ago

    Hey this is your friendly reminder to spread out in the Fediverse. Stop making communities on the big servers. Now all those users just lost a big chunk of content and they’re likely to leave Lemmy and spread the word about how the Fediverse will never work because of trigger-happy admins.

    • Blaze@discuss.tchncs.de · ↑19 ↓3 · 11 months ago

      On the other hand, you still want an instance that will last long enough.

      We lost lemmy.film a few weeks back; that was a loss for everyone invested in its main community (about movies and films).

      But I agree with you that we should spread communities out across servers. I like lemm.ee and sh.itjust.works; the admins seem pretty chill and only use defederation as a last resort.

      • ram@bookwormstory.social · ↑10 · edited · 11 months ago

        Personally I prefer sticking with smaller instances of maybe a few hundred or a thousand users. The more evenly spread out we are across instances, the more democratized the federation is.

    • AMillionNames@sh.itjust.works · ↑1 · 11 months ago

      The problem with that is that you aren’t going to post the same comment across different instances of what’s basically the same community; you generally want to engage where the most users are. I recently got banned under false accusations, and it’s pretty easy to see when it happened, because the number of upvotes and the engagement on my comments to that instance dropped drastically: only the people from my instance were seeing them anymore.

      The next step for Lemmy might be the concept of mirrored communities, where comments are automatically propagated across instances if the communities belong to the same owners and they have mirroring enabled. Admins would control access/visibility for users of their instance, and the community owners would control the access/visibility of all instances they’ve reserved the community under. Admins could still decide to remove all the moderators or defederate the community in the instance they control to sever the mirroring and create their own, but it might still help the smaller instances get going.
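
      A rough sketch of how that mirroring logic could be modeled, purely as an illustration (all names and types here are hypothetical, not anything from Lemmy’s actual codebase or federation protocol):

      ```rust
      // Hypothetical model of "mirrored communities": illustration only,
      // not Lemmy's real schema or API.
      use std::collections::{HashMap, HashSet};

      type InstanceId = String;
      type CommunityId = String;

      /// A community the same owners have reserved on several instances.
      struct MirroredCommunity {
          id: CommunityId,
          /// Instances where the owners have enabled mirroring.
          mirrors: HashSet<InstanceId>,
      }

      /// Per-instance admin policy: a local admin can sever a mirror.
      #[derive(Default)]
      struct InstancePolicy {
          severed_mirrors: HashSet<CommunityId>,
      }

      /// Fan a new comment out to every mirror whose local admin
      /// has not severed the mirroring for this community.
      fn propagation_targets(
          community: &MirroredCommunity,
          policies: &HashMap<InstanceId, InstancePolicy>,
      ) -> Vec<InstanceId> {
          community
              .mirrors
              .iter()
              .filter(|instance| {
                  policies
                      .get(*instance)
                      .map_or(true, |p| !p.severed_mirrors.contains(&community.id))
              })
              .cloned()
              .collect()
      }

      fn main() {
          let community = MirroredCommunity {
              id: "anime".into(),
              mirrors: ["a.example", "b.example"].map(String::from).into(),
          };

          // b.example's admin severs the mirror locally.
          let mut severed = InstancePolicy::default();
          severed.severed_mirrors.insert("anime".into());
          let policies = HashMap::from([("b.example".to_string(), severed)]);

          // Only a.example still receives propagated comments.
          println!("{:?}", propagation_targets(&community, &policies));
      }
      ```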

      I’m not sure how admins, especially the ones who are ok with lying to their users, would be ok with it, and it’s meaningless if they just wield their charisma and taint those communities as well. So far, they are pretty blatant, yet admins either aren’t bothering to check the evidence or they simply don’t want to defederate, which is just another way of condoning their behavior to avoid risking user engagement. Adding mirrored communities into the mix may just not solve it, because the problem is still there: a divided user base that’s getting treated like cattle without knowing it.

  • jsdz@lemmy.ml · ↑41 ↓5 · edited · 11 months ago

    It was added to the “exclude” list in an apparently unrelated commit three days ago with absolutely no explanation. Glancing at its front page, I see nothing objectionable, just a lot of anime stuff. When challenged, u/dessalines had nothing to say other than “no, that is full of CSAM” and just closed the discussion without further comment.

    Unless some more info comes to light it does not look good. Probably as good a time as any to depart from lemmy.ml.

    • Izzy@lemmy.ml · ↑8 ↓6 · edited · 11 months ago

      Probably as good a time as any to depart from lemmy.ml.

      If the devs / admins of lemmy.ml can’t be trusted and the admins of lemmy.world are abusive, then it is safe to say the experiment called Lemmy has failed. There is no recovery when the top 2 instances, which make up most of the “content”, are not worth supporting. I could go to another instance and block lemmy.world and lemmy.ml once the BE 0.19.0 update rolls out, but then the site is just dead. It’s already pretty much like talking to the wind, but the site would be truly empty at that point.

      I started noticing the trend of instances defederating into little islands months ago, but it seems obvious at this point that the concept of federation isn’t going to work out well. The easily self-hostable part is still nice even if it eventually ends up as singular instances with maybe 1 or 2 federated connections that actually post things. There will be a lot of instances that have nothing, but I don’t think that really counts.

      • fred@lemmy.ml · ↑6 ↓1 · 11 months ago

        You could go to another instance and remain federated with both of those and with ani.social. Whether they defederate from ani.social or not doesn’t stop you from engaging with them.

        • Izzy@lemmy.ml · ↑5 ↓1 · 11 months ago

          I understand that this is possible. If it were some bad community moderators I would just avoid those communities. If the entire instance is tainted then I wouldn’t want to engage with it even if the instance is federated.

      • cacheson@kbin.social · ↑14 ↓2 · 11 months ago

        Literally any evidence at all beyond “dessalines said so” would be a good start. Hell, even dessalines specifically describing what he saw would be great.

        • density@kbin.social · ↑2 ↓14 · 11 months ago

          So you want a link to CSAM, or an admission from someone that they searched for and viewed CSAM?

          • cacheson@kbin.social · ↑10 ↓2 · 11 months ago

            What are you on about? Dessalines said “No, that is full of CSAM.” I would like to know how they came to that conclusion.

      • jsdz@lemmy.ml · ↑7 · edited · 11 months ago

        I wouldn’t need to be wholly convinced that there’s anything heinous going on over there, just that the person accusing them of it had good reason to think so. So pretty much anything more than no info at all would probably have done the trick. Anyway, thanks for putting up with me for a little while and good luck to everyone at lemmy.ml, but I’m outta here. I’ll probably go try kbin or something.

  • wellheh@lemmy.sdf.org · ↑35 ↓3 · 11 months ago

    What is with this awful title? There’s no evidence “found” that there was any CSAM.

  • Izzy@lemmy.ml · ↑43 ↓14 · 11 months ago

    Might as well disallow all NSFW content if naked anime girls are going to be considered CSAM. Conflating these two things makes light of a real problem.

    • ram@bookwormstory.social · ↑24 · 11 months ago

      IDK where the lemmy admins are based out of, but many countries consider hentai depicting underage characters to be illegal; my country, Canada, is one such country.

    • Yote.zip@pawb.social · ↑12 ↓4 · 11 months ago

      For the record, Lemmy.ml does actually disallow NSFW, and they defederated from Yiffit.net (a general furry instance) because it has NSFW communities. The apparent message: tread carefully around them or they’ll remove you from the Lemmyverse.

      • Izzy@lemmy.ml · ↑12 ↓1 · 11 months ago

        How would federation work in that case? Are they going to defederate from any instance that has NSFW content? By their own definition, I’ve found CSAM on lemmy.world and every other instance that has NSFW communities.

        • Yote.zip@pawb.social · ↑4 ↓3 · 11 months ago

          That seems to be their goal, though they are probably targeting the specific instances that they notice most often. I think Yiffit tried to convince them to just block NSFW content, or just specific communities, instead of defederating entirely, but apparently that didn’t work; I’m not in the loop on how the conversation went.

          • Izzy@lemmy.ml · ↑3 · 11 months ago

            It’s interesting that they even programmed the ability to flag communities and posts as NSFW, and for users to turn NSFW content off in their settings, if they didn’t want any NSFW content federated to them.
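
            For what it’s worth, the per-user filter being described is conceptually tiny; a minimal sketch, assuming hypothetical names rather than Lemmy’s actual code:

            ```rust
            // Minimal sketch of a per-user NSFW filter (hypothetical names,
            // not Lemmy's real types).
            struct Post {
                title: String,
                nsfw: bool,
            }

            struct UserSettings {
                show_nsfw: bool,
            }

            /// Hide NSFW-flagged posts unless the user has opted in.
            fn visible_posts(posts: Vec<Post>, settings: &UserSettings) -> Vec<Post> {
                posts
                    .into_iter()
                    .filter(|p| settings.show_nsfw || !p.nsfw)
                    .collect()
            }

            fn main() {
                let posts = vec![
                    Post { title: "wholesome meme".into(), nsfw: false },
                    Post { title: "lewd art".into(), nsfw: true },
                ];
                let settings = UserSettings { show_nsfw: false };
                // Prints only "wholesome meme".
                for p in visible_posts(posts, &settings) {
                    println!("{}", p.title);
                }
            }
            ```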

  • Veraxus@kbin.social · ↑30 ↓6 · edited · 11 months ago

    Actual CSAM, depicting an actual crime against an actual child… or “someone drew dirty cartoons and I’m a moron who thinks dirty drawings are the same as one of the most heinous crimes imaginable - harming a vulnerable child”?

      • Veraxus@kbin.social · ↑14 ↓7 · edited · 11 months ago

        This is a very simple calculation for me. I follow the “golden rule of liberty”, which can also be called the “harm principle.”

        That is, “your rights end where mine begin; my rights end where yours begin.” Or: it is unethical to restrict anyone’s freedoms/liberties (especially expression) if they are not inflicting harm on others (i.e. infringing on their rights).

        Furthermore, I object to any level of subjective analysis of the “legality” of art. Ergo, the mindset of “I think this looks childish, therefore it is a child, therefore it is CP” is exceptionally unethical and should not be tolerated.

        Moreover, all of this only muddies and minimizes the ACTUAL crime of abusing children, and diverts resources away from the protection of real, actual children, all because of some inane moralizing over someone’s artwork.

        You are too hung up on “punishing the immoral” to realize that the ACTUAL need is “protecting the vulnerable” - and those two things are NOT the same.

        • JohnDClay@sh.itjust.works · ↑12 ↓1 · 11 months ago

          This has nothing to do with legality or restricting freedoms. It’s about the admins shaping the forum into the form they’d like to see.

          Plus, the harm principle is really fuzzy. What level of interaction is the cutoff for harm vs. inadvertent impact?

          I also think drawn kiddie porn hurts the people who view it inadvertently. I don’t know if there have been studies on a causal link between viewing CP content and sexual preferences towards minors, or on a causal relationship between that and abuse/grooming. But that’s another possible harm connection.

          • wellheh@lemmy.sdf.org · ↑2 · 11 months ago

            Honestly, this argument reminds me of the age-old claim that “video games cause violence”, because people thought glorifying violence in video games would get you to shoot people. In reality, there is still no link between gaming and violence. Sick people hurt other people, and blaming art instead of personal responsibility is sad.

            • JohnDClay@sh.itjust.works · ↑2 ↓1 · 11 months ago

              That’s why I said there’d need to be studies on a causal link for this specifically. I know video games have had those studies done, and they found that there isn’t a link. So you’d want a similar study for this. But there’s still the issue of accidentally stumbling across it, too.

    • Uranium3006@kbin.social · ↑11 ↓4 · 11 months ago

      loli content (CSAM)

      In this case, the latter.

      People get more offended by cartoons than actual child abuse, I swear.

  • JohnDClay@sh.itjust.works · ↑15 · 11 months ago

    Does anyone know if ani does have a lot of CSAM? Their rule against it seems pretty robust. Was there stuff getting by the rule, or do the .ml admins have a broader definition of CSAM?

    1. Do not submit content depicting a child (both real and virtual) engaged or involved in explicit sexual activities. A child is defined as a person who is under 18 years old; or a person, regardless of age, who is presented, depicted or portrayed as under 18 years old.

    • WadamT@lemmy.ml · ↑18 ↓2 · 11 months ago

      I am also on ani.social but have never seen any CSAM content there. Note there are safe-to-view anime loli character meme posts there.

      • JohnDClay@sh.itjust.works · ↑7 ↓2 · 11 months ago

        Maybe the .ml admins classified SFW loli as CSAM? I don’t think I’ve seen any on ani, but I did see some SFW loli foot stuff somewhere that was disturbing. Maybe it was something like that?

        • Metal Zealot@lemmy.ml · ↑3 ↓1 · 11 months ago

          Wikipedia:

          In Japanese popular culture, lolicon (ロリコン, also romanized as rorikon or lolicom) is a genre of fictional media in which young (or young-looking) girl characters appear in romantic or sexual contexts.

          You are seriously not trying to say that there is such a thing as safe-for-work underage porn, are you?

    • cacheson@kbin.social · ↑11 ↓3 · 11 months ago

      or do the .ml admins have a broader definition of CSAM?

      Their definition seems to be “I don’t like anime”.

    • Veraxus@kbin.social · ↑10 ↓4 · 11 months ago

      I’m guessing this is yet another tiring instance of some idiot thinking “manga + nudity = CP”