• OsrsNeedsF2P@lemmy.ml · 10 days ago

    On average, around 3.5 new videos were uploaded to the platform every hour, many of which were previously unknown to law enforcement.

    Absolutely sick and vile. I hope they honey potted the site and that the arrests keep coming.

  • TJC@lemmy.world · 8 days ago

    Maybe Jeff Bezos will write an article about him and editorialize about “personal liberty”. I have to keep posting this because every day another MAGA lover, religious bigot, or otherwise pretend-upstanding community member is indicted or arrested for heinous acts against women and children.

  • PerogiBoi@lemmy.ca · 10 days ago

    Fuck man. I used to use a program called “Kidpix” when I was a kid. It was like ms paint but with fun effects and sounds.

  • clearedtoland@lemmy.world · 10 days ago

    With everything going on right now, the fact that I still feel physically sick reading things like this tells me I haven’t gone completely numb yet. Just absolutely repulsive.

    • unphazed@lemmy.world · 10 days ago

      I know I’m not heartless yet because I am still traumatized by the brick in the window video…

      • BumpingFuglies@lemmy.zip · 10 days ago

        I’ll probably regret asking, but I’m out of the loop and insatiably curious.

        Brick in the window video?

        • Miles O'Brien@startrek.website · 10 days ago

          If it’s what I’m thinking of, camera footage of a vehicle interior.

          Driving down the highway, going under an overpass when a brick gets tossed by some kids and goes through the window.

          Passenger hit, husband is driving and screams.

          You know that scream they mention in The Princess Bride? That “only someone experiencing ultimate suffering” can make?

          If you know, you know.

          • aviationeast@lemmy.world · 10 days ago

            I’ve never seen that video but I can hear that scream from the husband. That’s some fucked up shit.

          • Liz@midwest.social · 10 days ago

            It’s not an overpass. A loose brick falls off a truck going in the opposite direction, bounces off the pavement once, then goes through the windshield.

            Edit: oh hurray, there are two different brick videos.

            • Miles O'Brien@startrek.website · 9 days ago

              Well, I know what other video I’m never watching.

              And people wonder why I don’t like being around any vehicle that carries things…

        • HappyFrog@lemmy.blahaj.zone · 9 days ago

          They’re probably referencing the video where a woman was killed after a brick flew through the windshield. I haven’t watched it, but it is on YouTube and I’ve heard that the husband’s cries are not so nice.

          I don’t remember if it was kids throwing bricks off of a bridge or if it was something else.

        • FauxLiving@lemmy.world · 10 days ago

          To sanitize the traumatic video as much as possible: a man is driving under an overpass and a brick is dropped through the passenger-side window, instantly killing his wife. He reacts in horror.

  • GnuLinuxDude@lemmy.ml · 9 days ago

    Every now and again I am reminded of my sentiment that the introduction of “media” onto the Internet is a net harm. Maybe 256 dithered color photos like you’d see in Encarta 95 and that’s the maximum extent of what should be allowed. There’s just so much abuse from this kind of shit… despicable.

    • TankovayaDiviziya@lemmy.world · 9 days ago

      It is easy to feel very disillusioned with the world, but it is important to remember that there are still good people all around willing to fight the good fight. It is also important to remember that technology is not inherently bad; it is a neutral tool that people can use for either good or bad purposes.

    • adhdplantdev@lemm.ee · 9 days ago

      I think it just shows all the hideousness of humanity in all its glory, in a way that we have never confronted before. It shatters the illusion that humanity has grown out of its barbaric ways.

    • Blackmist@feddit.uk · 9 days ago

      Raping kids has unfortunately been a thing since long before the internet. You could legally bang a 13 year old right up to the 1800s and in some places you still can.

      As recently as the 1980s people would openly advocate for it to be legal, and remove the age of consent altogether. They’d get it in magazines from countries where it was still legal.

      I suspect it’s far less prevalent now than it’s ever been. It’s now pretty much universally seen as unacceptable, which is a good start.

      • mic_check_one_two@lemmy.dbzer0.com · 9 days ago

        The youngest Playboy model, Eva Ionesco, was only 12 years old at the time of the photo shoot, and that was back in the late 1970s… It ended up being used as evidence against Eva’s mother (who was also the photographer), and she ended up losing custody of Eva as a result. The mother had started taking erotic photos (ugh) of Eva when she was only 5 or 6 years old, under the guise of “art”. It wasn’t until the Playboy shoot that authorities started digging into the mother’s portfolio.

        But also worth noting that the mother still holds copyright over the photos, and has refused to remove/redact/recall photos at Eva’s request. The police have confiscated hundreds of photos for being blatant CSAM, but the mother has been uncooperative in a full recall. Eva has sued the mother numerous times to try and get the copyright turned over, which would allow her to initiate the recall instead.

  • mic_check_one_two@lemmy.dbzer0.com · 9 days ago

    Here’s a reminder that you can submit photos of your hotel room to law enforcement, to assist in tracking down CSAM producers. The vast majority of sex trafficking media is produced in hotels. So being able to match furniture, bedspreads, carpet patterns, wallpaper, curtains, etc in the background to a specific hotel helps investigators narrow down when and where it was produced.

    https://traffickcam.com/

        • Snowclone@lemmy.world · 9 days ago

          I worked in customer service a long time. No one was trained on how to be law enforcement and no one was paid enough to be entrusted with public safety beyond the common sense everyday people have about these things. I reported every instance of child abuse I’ve seen, and that’s maybe 4 times in two decades. I have no problem with training and reporting, but you have to accept that the service staff aren’t going to police hotels.

  • j0ester@lemmy.world · 9 days ago

    They also seized 72,000 illegal videos from the site and personal information of its users, resulting in arrests of 1,400 suspects around the world.

    Wow

  • taladar@sh.itjust.works · 10 days ago

    Does it feel odd to anyone else that a platform for something this universally condemned in any jurisdiction can operate for 4 years, with a catchy name clearly thought up by a marketing person, its own payment system, and a nearly six-figure number of videos? I mean, even if we assume that some of those 4 years were intentional, to allow law enforcement to catch as many perpetrators as possible, this feels too similar to fully legal operations in scope.

    • swelter_spark@reddthat.com · 9 days ago

      It definitely seems weird how easy it is to stumble upon CP online, and how open people are about sharing it, with no effort made, in many instances, to hide what they’re doing. I’ve often wondered how much of the stuff is spread by pedo rings and how much is shared by cops trying to see how many people they can catch with it.

      • Ledericas@lemm.ee · 9 days ago

        It can hide in plain sight, and then when you dig into someone’s profile, it can lead to a person or group discussing CSAM and bestiality, not just CP, on a site similar to r/pics or a porn site. Yeah, sometimes you stumble onto a site like that, but it seems to happen when people search for porn outside of Pornhub and its affiliated sites. Remember, PH sanitized their site because of this. Last decade there was an article about an obscure site that was taken down; it had Reddit-like porn subs, etc. People were complaining about the CSAM, and nothing was done about it. It was eventually taken down for legal reasons not related to the CSAM.

        • swelter_spark@reddthat.com · 8 days ago

          I can definitely see how people could find it while looking for porn. I don’t understand how people can do this stuff out in the open with no consequences.

          • Ledericas@lemm.ee · 8 days ago

            Yeah, it’s often hidden too well to be easily found, and the authorities might want to gather evidence, so they let it accumulate and then pounce. One of the sites was mostly innuendo and talk about committing it, not actually distributing the material; they co-opt certain images to pervert them. Other deviancies like bestiality were also present.

      • Cryophilia@lemmy.world · 9 days ago

        If you have stumbled on CP online in the last 10 years, you’re either really unlucky or trawling some dark waters. This ain’t 2006. The internet has largely been cleaned up.

        • Ledericas@lemm.ee · 9 days ago

          Most definitely not clean, lmao. You’re just not actively searching for it, or stumbling onto it.

        • LustyArgonian@lemmy.world · 9 days ago

          Search “AI woman porn miniskirt,” and tell me you don’t see questionable results in the first two pages: women who at least appear possibly younger than 18. Because AI is so heavily corrupted with this content en masse, it has leaked over into Google searches, with most porn categories corrupted by AI seeds that can be anything.

          Fuck, the head guy of Reddit, u/spez, was the main mod of r/jailbait before he changed the design of Reddit so he could hide mod names. Also, look into the u/MaxwellHill / Ghislaine Maxwell conspiracy on Reddit.

          There are very weird, very large movements regarding illegal content (whether you intentionally search for it or not) and blackmail, and that’s all I will point out for now.

          • Cryophilia@lemmy.world · 9 days ago

            Search “AI woman porn miniskirt,”

            Did it with safesearch off and got a bunch of women clearly in their late teens or 20s. Plus, I don’t want to derail my main point, but I think we should acknowledge the difference between a picture of a real child actively being harmed vs. a 100% fake image. I didn’t find any AI CP, but even if I did, it’s in an entirely different universe of morally bad.

            r/jailbait

            That was, what, fifteen years ago? It’s why I said “in the last decade”.

            • LustyArgonian@lemmy.world · 8 days ago

              “Clearly in their late teens,” lol no. And since AI doesn’t have age, it’s possible that was seeded with the face of a 15yr old and that they really are 15 for all intents and purposes.

              Obviously there’s a difference between AI porn and real porn; that’s why I told you to search AI in the first place??? The convo isn’t about AI porn, but AI porn uses real images to seed its new images, including CSAM.

              • Schadrach@lemmy.sdf.org · 3 days ago

                was seeded with the face of a 15yr old and that they really are 15 for all intents and purposes.

                That’s…not how AI image generation works? AI image generation isn’t just building a collage from random images in a database - the model doesn’t have a database of images within it at all - it just has a bunch of statistical weightings and net configuration that are essentially a statistical model for classifying images, being told to produce whatever inputs maximize an output resembling the prompt, starting from a seed. It’s not “seeded with an image of a 15 year old”, it’s seeded with white noise and basically asked to show how that white noise looks like (in this case) “woman porn miniskirt”, then repeat a few times until the resulting image is stable.
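
                The loop described above (seed with white noise, repeatedly refine toward whatever best matches the prompt) can be sketched in a few lines. This is a toy stand-in only: a real diffusion model replaces the denoise step with a trained neural network, and every name and number here is made up for illustration.

```python
# Toy sketch of diffusion-style generation: start from random noise and
# iteratively nudge it toward the prompt's statistics. Illustrative only;
# no real model or stored images are involved.
import numpy as np

def denoise_step(image, prompt_stats, step):
    # Stand-in for one pass of a trained denoiser: move the noisy image
    # toward what the "model" associates with the prompt, with smaller
    # corrections as the image stabilizes.
    correction = prompt_stats - image
    return image + correction / (step + 2)

def generate(prompt_stats, steps=50, size=(8, 8), seed=0):
    rng = np.random.default_rng(seed)
    image = rng.standard_normal(size)   # seeded with white noise, not a photo
    for step in range(steps):           # repeat until the result is stable
        image = denoise_step(image, prompt_stats, step)
    return image

# The "prompt" here is just a target statistic, standing in for
# "whatever maximizes an output resembling the prompt".
target = np.full((8, 8), 0.5)
result = generate(target)
```

                The point of the sketch is that the only thing ever “seeded” into the process is random noise; the prompt enters only as statistical weightings.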

                Unless you’re arguing that somewhere in the millions of images tagged “woman” being analyzed to build that statistical model is probably at least one person under 18, and that any image of “woman” generated by such a model is necessarily underage because the weightings were impacted however slightly by that image or images, in which case you could also argue that all drawn images of humans are underage because whoever drew it has probably seen a child at some point and therefore everything they draw is tainted by having been exposed to children ever.

                • LustyArgonian@lemmy.world · 1 day ago

                  Yes, it is seeded with kids’ faces, including specific children like someone’s classmate. And yes, those children, all of them, who it uses as a reference to make a generic kid face, are all being abused because it’s literally using their likeness to make CSAM. That’s quite obvious.

                  It would be different if the AI was seeding models from cartoons, but it’s from real images.

              • Cryophilia@lemmy.world · 8 days ago

                It’s fucking AI, the face is actually like 3 days old because it is NOT A REAL PERSON’S FACE.

                • LustyArgonian@lemmy.world · 8 days ago

                  We aren’t even arguing about this, you giant creep who ALWAYS HAS TO GO TO BAT FOR THIS TOPIC REPEATEDLY.

                  It’s meant to LOOK LIKE a 14 yr old because it is SEEDED OFF 14 YR OLDS, so it’s indeed CHILD PORN that is EASILY ACCESSED ON GOOGLE, per the original commenter’s claim that people have to be going to dark places to see this - NO, it’s literally in nearly ALL AI TOP SEARCHES. And it indeed counts for LEGAL PURPOSES in MOST STATES as child porn even if drawn or created with AI. How many porn AI models look like Scarlett Johansson because they are SEEDED WITH HER FACE? Now imagine who the CHILD MODELS are seeded from.

                  You’re one of the people I am talking about when I say Lemmy has a lot of creepy pedos on it FYI to all the readers, look at their history

        • swelter_spark@reddthat.com · 9 days ago

          I don’t know about that.

          I spot most of it while looking for out-of-print books about growing orchids on the typical file-sharing networks. The term “blue orchid” seems to be frequently used in file names of things that are in no way related to gardening. The eMule network is especially bad.

          When I was looking into messaging clients a couple years ago, to figure out what I wanted to use, I checked out a public user directory for the Tox messaging network and it was maybe 90% people openly trying to find, or offering, custom made CP. On the open internet, not an onion page or anything.

          Then maybe last year, I joined openSUSE’s official Matrix channels, and some random person (who, to be clear, did not seem connected to the distro) invited me to join a room called openSUSE Child Porn, with a room logo that appeared to be an actual photo of a small girl being violated by a grown man.

          I hope to god these are all cops, because I have no idea how there can be so many pedos just openly doing their thing without being caught.

          • Cryophilia@lemmy.world · 9 days ago

            typical file-sharing networks

            Tox messaging network

            Matrix channels

            I would consider all of these to be trawling dark waters.

            • Schadrach@lemmy.sdf.org · 8 days ago

              …and most of the people who agree with that notion would also consider reading Lemmy to be “trawling dark waters” because it’s not a major site run by a massive corporation actively working to maintain advertiser friendliness to maximize profits. Hell, Matrix is practically Lemmy-adjacent in terms of the tech.

              • Cryophilia@lemmy.world · 8 days ago

                This ain’t the early 2000s. The unwashed masses have found the internet, and it has been cleaned for them. 97% of the internet has no idea what Matrix channels even are.

        • veeloth@lemm.ee · 9 days ago

          Not stumbled upon it, but I’ve met a couple of people offering it on mostly normal Discord servers.

    • deegeese@sopuli.xyz · 10 days ago

      Illegal business can operate online for a long time if they have good OpSec. Anonymous payment systems are much easier these days because of cryptocurrencies.

    • x00z@lemmy.world · 10 days ago

      It’s a side effect of privacy and security. The one side effect they’re trying to use to undermine all of the privacy and security.

      • TheProtagonist@lemmy.world · 9 days ago

        This has nothing to do with privacy! Criminals have their techniques and methods to protect themselves and their “businesses” from discovery, both in the real world and in the online world. Even in a complete absence of privacy they would find a way to hide their stuff from the police - at least for a while.

        In the real world, criminals (e.g. drug dealers) also use cars, so you could argue that drug trafficking is a side effect of people having cars…

        • x00z@lemmy.world · 9 days ago

          This platform used Tor. And because we want to protect privacy, they can make use of it.

        • Cethin@lemmy.zip · 9 days ago

          Well, it does have to do with privacy and security, it just doesn’t matter if it’s legal or not for them. These people (in the US) always make a point that criminals will buy guns whether it’s legal or not, but then they’ll argue they need to destroy privacy because criminals are using it. It doesn’t make sense, but it doesn’t need to because honesty or consistency aren’t important.

    • sleen@lemmy.zip · 9 days ago

      With the number of sites easily accessible on the dark net through the hidden wiki and other sites, this might have been a honeypot from the start.

      On the contrary, why would they announce that they seized the site? To cause more panic, and to exaggerate the actual situation?

      In addition, that last point should be considered because even if they used these types of operations, honeypotting would still be considered illegal. So ultimately, what is stopping the supreme power from abusing that power on other people?

      • quack@lemmy.zip · 8 days ago

        No judge would authorise a honeypot that runs for multiple years, hosting original child abuse material, meaning that children are actively being abused to produce content for it. That would be an unspeakable atrocity. A few years ago the Australian police seized a similar website and ran it for a matter of weeks to gather intelligence, which undoubtedly protected far more children than it harmed, and even that was considered too far for many.

        • sleen@lemmy.zip · 8 days ago

          “That would be an unspeakable atrocity,” yet there is a contradiction in the final sentence. The issue is, what evidence is there to prove such an operation actually works? As my last point implied, what stops the government from abusing this sort of operation? With “covert” operations like this, the outcome can be catastrophic for everyone.

      • Geetnerd@lemmy.world · 9 days ago

        Epstein was very smart, and figured out early on there were many, many rich pedophiles.

        So, he got buddy buddy with them, supplied young girls to them.

        BUT, he filmed the encounters in secret, and blackmailed the shit out of these people.

        He was smart enough to become obscenely rich on Wall Street legitimately, but he liked to bang little girls, found others who did too, and then extorted them.

        There’s an anecdote about how when Epstein was holding court with other Aristos, they would bring up any random subject, to get his opinion.

        What would he say? “What does that have to do with pussy?”

        Many, many people have verified that. But because he was filthy rich, everyone just laughed and blew it off.

        Epstein was murdered. I’m not a conspiracy nut; it’s just blatantly obvious. The 2 guards on duty admitted to fucking off (bribed), and were acquitted.

        https://www.nbcnews.com/news/us-news/case-dropped-jail-guards-duty-night-epstein-died-rcna10557

        • Ledericas@lemm.ee · 9 days ago

          Trump was his most frequent guest, and Trump had his goons do everything in their power to get rid of the evidence while Epstein was still alive. A lot of politicians from different countries are part of it, as are Hollywood execs; Weinstein was probably the most infamous one.

        • surewhynotlem@lemmy.world · 9 days ago

          Context is important I guess. So two things.

          Is something illegal if it’s not prosecuted?

          Is it CSA if the kid is 9 but that’s marrying age in that country?

          If you answer yes, then no, then we’ll not agree on this topic.

          • taladar@sh.itjust.works · 9 days ago

            I am not talking about CSA, I am talking about video material of CSA. Most countries with marriage ages that low have much more widespread bans on videos including sex of any kind.

            As for prosecution, yes, it is still illegal if it is not prosecuted. There are many reasons not to prosecute something ranging all the way from resource and other means related concerns to intentionally turning a blind eye and only a small minority of them would lead that country to actively sabotage a major international investigation, especially after the trade-offs are considered (such as loss of international reputation by refusing to cooperate).

        • Rob1992@lemmy.world · 9 days ago

          Pick any country where child marriage is legal and where women are an object the man owns.

    • prole@lemmy.blahaj.zone · 9 days ago

      with a catchy name clearly thought up by a marketing person

      A marketing person? They took “Netflix” and changed the first three letters lol

      • imetators@lemm.ee · 9 days ago

        Exactly! There is a plethora of *flix sites out there, including adult ones. It does not take much marketing skill to name a site like this.

  • SpiceDealer@lemmy.dbzer0.com · 9 days ago

    Massive congratulations to Europol and its partners in taking this shit down and putting these perverts away. However, they shouldn’t rest on their laurels. The objective now is to ensure that the distribution of this disgusting material is stopped outright and that no further children are harmed.

    • FauxLiving@lemmy.world · 9 days ago

      The objective now is to ensure that the distribution of this disgusting material is stopped outright and that no further children are harmed.

      Sure, it’ll only cost you every bit of your privacy as governments make illegal and eliminate any means for people to communicate without the eye of Big Brother watching.

      Every anti-privacy measure that governments put forward is always like "We need to be able to track your location in real time, read all of your text messages and see every picture that your phone ever takes so that we can catch the .001% of people who are child predators. Look at how scary they are!

      Why are you arguing against these anti-pedophile laws?! You don’t support child sex predators do you?!"

  • Doctor_Satan@lemm.ee · 8 days ago

    Goddam what an obvious fucking name. If you wrote a procedural cop show where the child traffickers ran a site called KidFlix, you’d be laughed out of the building for being so on-the-nose.

    • rottingleaf@lemmy.world · 6 days ago

      Depends on your taste for stories and the general atmosphere. I think in better parts of Star Wars EU this would make sense (or it wouldn’t, but the right way, same as in reality).

    • Eugene V. Debs' Ghost@lemmy.dbzer0.com · 9 days ago

      “See, we caught these guys without doing it; think of how many more we can catch if we do! Like all the terrorists America has caught by violating their privacy. …Maybe some day they will.”

    • Ronno@feddit.nl · 9 days ago

      Basically the only reason I read the article was to find out whether they needed a “backdoor” in encryption. Guess they don’t need it, like everyone with a little bit of IT knowledge always told them.