• MaximilianKohler@lemmy.world (OP) · 9 months ago

    > Lemmy has pretty much all the same problems as reddit does but at a much smaller scale because it’s just not as big. Would you suggest Google use Lemmy?

    I agree, and I covered that in my blog. Lemmy is astroturfed and may even be easier to astroturf than reddit. I would like to see a more diversified “discussions and forums” section, one that isn’t just reddit links.

    In general, privately-owned forums (running XenForo, etc.) seem much better run than most reddit subs. I have never experienced on forums the plethora of problems I’ve had on reddit. I think forums are harder to spam and astroturf, and their owners and moderators have different incentives than reddit mods.

    > The bar to entry as a new person on smaller forums was often high.

    I don’t remember experiencing that, but it makes me think of the bar to entry for running a reddit sub. Anyone can instantly create one for free, do whatever they want with it, and get to the top of search results pretty quickly. Setting up your own forum is a lot more difficult and more of a commitment. I think there are benefits to that.

    I agree with your last paragraph. I think the type of warnings Twitter implemented is a decent idea. In general, people need more warnings that what they see on reddit and other social media is not policed for legality: people can and do say whatever they like, and much of it is misinformation and disinformation.

    I don’t think most people realize that reddit and other social media platforms have no legal obligation to take down illegal content. People seem WAY too trusting of things they read on reddit. If Google is going to highlight reddit results and put them at the top, then it bears some responsibility for this.

    > Since the CDA’s passage in 1996, § 230(c) has been consistently interpreted by U.S. courts to provide broad immunity to platforms for hosting and facilitating a wide range of illegal content—from defamatory speech to hate speech to terrorist and extremist content.[12] Notice of illegal content is irrelevant to such immunity.[13] Thus, even if a platform like YouTube is repeatedly and clearly notified that it is hosting harmful content (such as ISIS propaganda videos), the platform remains immune from liability for hosting such harmful content.

    • atrielienz@lemmy.world · 9 months ago

      Something else to note: reddit’s system as it stands now will archive a post once it’s six months old, meaning you can’t upvote or downvote it, or even respond with a comment. So if what the author of your post is saying is true, then dealing with the entities that comment affiliate links and then artificially inflate those comments’ karma should be down to the moderators themselves.
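
      For anyone unfamiliar with that archiving behavior, here is a minimal sketch of how such a cutoff works. The post model and function names are hypothetical; the only detail taken from the comment above is the six-month cutoff, which implies any vote manipulation must happen inside a post’s first six months.

      ```python
      from datetime import datetime, timedelta, timezone

      # Hypothetical sketch: six months is the cutoff described above;
      # 183 days is an approximation of that window.
      ARCHIVE_AFTER = timedelta(days=183)

      def is_archived(created_at: datetime, now: datetime | None = None) -> bool:
          """Return True once a post is past the cutoff (no votes or comments)."""
          now = now or datetime.now(timezone.utc)
          return now - created_at >= ARCHIVE_AFTER

      # A year-old post is archived, so manipulation of it is no longer possible;
      # catching it during the open window falls to the moderators.
      old_post = datetime.now(timezone.utc) - timedelta(days=365)
      print(is_archived(old_post))  # True
      ```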

      One thing that I don’t see taken into account is the number of moderators whose subreddits were forcibly taken over, or who lost significant and powerful moderation tools when the Reddit API Fiasco went down last June. This article paints all moderators with the same brush, and given how reddit has cannibalized their ability to moderate appropriately, it remains unclear how much of this problem is down to the character of the moderators themselves (love them or hate them; I am largely indifferent). Given the nature of moderation on most online communities, and the fact that it is usually volunteer-based, is the question whether we should exclude reddit, or whether we should change the way message boards and their media are regulated to better align with consumer protections? Shouldn’t we be pushing for legislation that would punish the spread of misinformation? Or is that an overstep of government authority?

      Also, reddit didn’t become this behemoth of user-generated reviews overnight, and neither will any other community. Becoming successful in that space is somewhat on Google, but Google is a profit-driven company, so if we (consumers) rely on it with no government oversight, only communities that can pay to play will show up in the results. That’s a problem with Google and search engines like it, not necessarily with reddit, though reddit’s misinformation train does bear some of the blame for how this problem presents.

    • atrielienz@lemmy.world · 9 months ago

      To be clear, since I don’t think my meaning was clearly explained: I meant the bar for entry on smaller forums outside of reddit. Reddit has generally had problems with high-karma accounts bullying new accounts by taking advantage of the fact that new accounts are viewed (and have always been viewed) as less credible. But on private forums I was a part of in the early oughts and even the late ’90s, there were problems with treating newcomers of any stripe with distrust; every time I joined a new tech forum back then, that was the case. It was used as an anti-spam, anti-troll checks-and-balances sort of system: to build karma was to be allowed the benefit of interaction beyond upvotes and downvotes. While it might have been effective (in the same way that invite tokens or similar measures are), it was also very exclusive and made me feel unwelcome in the space. Part of the reason reddit grew in popularity is that it doesn’t have that unwelcoming feeling to the same extent, because a lot of those measures just aren’t in place.
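
      A karma-gated onboarding system like the one described above might look roughly like this. This is a hypothetical sketch: the forums in question never published their rules, so the threshold and field names here are invented for illustration.

      ```python
      from dataclasses import dataclass

      # Invented threshold; real forums tuned their own values.
      MIN_KARMA_TO_POST = 10

      @dataclass
      class Account:
          name: str
          karma: int = 0  # earned via votes from established members

      def can_post(account: Account) -> bool:
          """Newcomers are restricted to voting until they build enough karma."""
          return account.karma >= MIN_KARMA_TO_POST

      newcomer = Account("new_user")
      print(can_post(newcomer))   # False: limited to upvotes/downvotes
      newcomer.karma = 12
      print(can_post(newcomer))   # True: full interaction unlocked
      ```

      The trade-off is exactly the one described above: the gate filters spammers and trolls, but it also makes every legitimate newcomer feel distrusted until they earn their way in.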