Dear Lemmy.world Community,

Recently, posts were made to the AskLemmy community that violate not only our own policies but the basic ethics and morals of humanity as a whole. We acknowledge the gravity of the situation and the impact it may have had on our users, and we want to assure you that we take this matter seriously and are committed to making significant improvements to prevent such incidents in the future. I am reluctant to say exactly what these horrific and repugnant images were, but I'm sure you can guess what we've had to deal with and what some of our users unfortunately had to see. I'll add the specifics in a spoiler at the end of the post to spare the hearts and minds of those who don't know.

Our foremost priority is the safety and well-being of our community members. We understand the need for a swift and effective response to inappropriate content, and we recognize that our current systems, protocols, and policies were not adequate. We are immediately strengthening our moderation and administrative teams, implementing additional tools, and building enhanced pathways for a more robust and proactive approach to content moderation, including ensuring that reports reach the mod and admin teams more quickly.
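To illustrate the kind of tooling this involves (the actual implementation is not public; the function name, data model, and threshold below are purely illustrative assumptions), a report-escalation pass might simply surface posts that several distinct users have flagged, so they reach admins without waiting on a single moderator:

```python
# Hypothetical sketch only: flag posts reported by many distinct users so
# they are surfaced to admins quickly. Not the actual Lemmy.world tool.
def posts_needing_attention(reports, threshold=3):
    """Given (post_id, reporter) pairs, return post IDs reported by at
    least `threshold` distinct users, most-reported first."""
    reporters_per_post = {}
    for post_id, reporter in reports:
        # A set per post ignores duplicate reports from the same user.
        reporters_per_post.setdefault(post_id, set()).add(reporter)
    flagged = [(len(who), pid) for pid, who in reporters_per_post.items()
               if len(who) >= threshold]
    return [pid for _, pid in sorted(flagged, reverse=True)]
```

Counting distinct reporters rather than raw reports keeps one user from escalating a post on their own, while still letting genuinely widespread reports rise to the top.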

The first step will be limiting the image hosting sites that Lemmy.world will allow. We understand that this may frustrate some of our users, but we hope you can understand the gravity of the situation and why we find it necessary, both to protect our users from seeing this material and to protect ourselves as a site. That being said, we would like input on which image sites we will be whitelisting. While we run a filter over all images uploaded to Lemmy.world itself, that filter does not apply to other sites, which is why a whitelist is necessary.
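To make the whitelist idea concrete, here is a minimal sketch of how such a check could work. The domains listed are placeholders, not the actual whitelist (which, per the above, is still open for community input):

```python
from urllib.parse import urlparse

# Placeholder whitelist for illustration; the real list is undecided.
ALLOWED_IMAGE_HOSTS = {"postimages.org", "imgur.com"}

def is_allowed_image_url(url: str) -> bool:
    """Accept a URL only if its host is a whitelisted domain or a
    subdomain of one (e.g. i.imgur.com under imgur.com)."""
    host = (urlparse(url).hostname or "").lower()
    return any(host == allowed or host.endswith("." + allowed)
               for allowed in ALLOWED_IMAGE_HOSTS)
```

Matching subdomains with a `.`-prefixed suffix check avoids the classic pitfall of a bare `endswith("imgur.com")`, which would also accept a hostile domain like `notimgur.com`.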

This is a community made by all of us, not just the admins, which leads to the second step: we will be looking for more moderators and community members in more diverse time zones. We recognize that coverage is currently concentrated in Europe and North America, and we want to strengthen other time zones to limit delays as much as humanly possible in the future.

We understand that trust is essential, especially when dealing with something as awful as this, and we appreciate your patience as we work diligently to rectify this situation. Our goal is to create an environment where all users feel secure, respected, and, most importantly, safe. Your feedback is crucial to us, and we encourage you to continue sharing your thoughts and concerns.

Every moment is an opportunity to learn and build, even the darkest ones.

Thank you for your understanding.


Sincerely,

The Lemmy.world Administration

Legal / ToS

spoiler

CSAM


  • Sterile_Technique@lemmy.world · 8 months ago

    What is the complete correct response for users to carry out if they spot CP?

    Is just the report button on the post good enough? Is there some kind of higher level report for bigger-than-just-one-instance shit that threatens Lemmy as a whole? Should we call the FBI or some shit?

    I haven’t come across any here, but if I do, I’d like to be able to aid in swift action against not only the post/account in question, but against the actual person running it.

    • RightHandOfIkaros@lemmy.world · 8 months ago

      In theory, reporting to the community moderators should be enough for users. It would then be those moderators' responsibility to report it to the instance admins, and the admins' responsibility to report it to the instance's local law enforcement, who will then handle it appropriately.

      However, sometimes community moderators are corrupt and will ignore reports, or even ban users for reporting rule-breaking content. In those cases, the user must report directly to the instance admins. As you can imagine, instance admins can also be corrupt, in which case the user must report to law enforcement.

      But typically the first scenario is sufficient.

      • Thekingoflorda@lemmy.world · 8 months ago

        FYI: admins can see all reports. We currently have a tool running that scans for posts that are reported a lot, which will then notify people who can do something about it.

  • lurch (he/him)@sh.itjust.works · 8 months ago

    Well thanks for the spoiler thing, but I don’t even know what the acronym (is it even an acronym?) means anyway and now I’m too afraid to do a web search for it 😅

    Well, maybe it’s better that way.

    • PervServer@lemmynsfw.com · 8 months ago (edited)

      ::: spoiler CSAM is child sexual abuse material I believe. So yeah, better not to look it up :::

      • humorlessrepost@lemmy.world · 8 months ago

        I’m pretty convinced the initialism was created so that people could Google it in an academic context without The Watchers thinking they were looking for the actual content.

        • tpihkal@lemmy.world · 8 months ago

          You may be correct although it seems like pretty dumb reasoning. I doubt any of those cretins would search the words “child sexual abuse material.” That would require acknowledging the abuse part of it.

          • forrgott@lemm.ee · 8 months ago

            I think you may have misunderstood. The entire point is to have an academic term that would never be used as a search by one of those inhuman lowlifes.

            I don’t mean to be pedantic, so I hope my meaning came across well enough…

            • tpihkal@lemmy.world · 8 months ago

              I think my point is that the acronym exists because of the search term, not the other way around. And it’s pretty laughable that the academic term has to be distilled down to an acronym because it is otherwise considered a trigger word.

  • Dankry@lemmy.world · 8 months ago (edited)

    > That being said we would like input in what image sites we will be whitelisting.

    I’d like to suggest postimages(dot)org. I’ve been using that site since leaving reddit/imgur over the summer. They seem to be a good free service (although they do offer a premium tier) and according to their ‘about us’ section they’ve been operating for 20 years.

    • m-p{3}@lemmy.ca · 8 months ago (edited)

      As one of the smaller instance admins who had to deal with the content… one of the image hosts was postimg.cc, so I'm not sure how quickly they take down content or whether they run any kind of filtering.