• BonesOfTheMoon@lemmy.world
    · 7 days ago

    As an aside, a great deal of CSAM is shared through Facebook. CSAM survivors have asked Meta to stop this, and it said no. The survivor advocacy group Phoenix 11 submitted six formal questions to old Zuckface in the US Congress about his rollout of end-to-end encryption, which makes this possible, and he dodged them like the lying fuck he is. Zuck would sell it himself if it made him a whole dollar, and nobody should forget that.

    • Credibly_Human@lemmy.world
      · 7 days ago

      I completely disagree with your take here.

      The idea that we shouldn’t have services with end-to-end encryption because “think of the children” is an absurd take to have.

      I think you’re being upvoted because it’s anti-Zuckerberg, but seriously people, think about the long-term consequences of not being able to chat without being spied on.

      • Øπ3ŕ@lemmy.dbzer0.com
        · 7 days ago

        Yeah, that argument is shit wrapped in a candy shell. Too many people’ll swallow it without so much as a blink. 🥲

        • sleen@lemmy.zip
          · 7 days ago

          A case of indoctrination by the government; that’s what they expect: a stream of false narratives utilising children as the weapon.

    • TeddE@lemmy.world
      · 7 days ago

      Which is frustrating, as every single time the government wanted to expand surveillance on regular citizens, preventing exactly this was touted as the justification. Now we have all the surveillance, and the monopolies are like ‘Nah, exploitation is profitable’.

  • Asafum@feddit.nl
    · 7 days ago

    “So just to get this straight, you’re saying you have downloaded 152 … zettabytes … of porn for your own personal use?”

  • MachineFab812@discuss.tchncs.de
    · 7 days ago (edited)

    Where did they get the idea that that’s a more respectable response?

    EDIT: Doesn’t/shouldn’t work for their liability either. Vocabulary fail on my part.

    • FishFace@piefed.social
      · 7 days ago

      They didn’t claim it was respectable, they claimed it made them not liable? Where’d you get this idea?

        • FishFace@piefed.social
          · 7 days ago

          Why should a company be legally responsible for copyright infringement of its employees, if it wasn’t something they did for work?

          • MachineFab812@discuss.tchncs.de
            · 7 days ago

            If you or I can be held responsible for such activities from our homes, why give Google an exemption?

            It would depend on jurisdiction, of course; many of us live in places that will give us (with the help of a lawyer…) a bit of an out for guest wifi or Tor exit nodes. But ultimately, you know Google is going to settle for little more (or less) than it would have cost them to buy these works at retail, whereas you or I would also get slapped with thousands of dollars extra (per item?) in fines and legal fees.

            They can afford to pay for the porn, but they chose to go the “we shouldn’t have to because it’s smut” route, and didn’t bother trying to claim their employees were responsible for downloading random books/movies/whatever for personal use. Do they get to use this out for CP?

            Also, unlike you or I, they have logging in place, such that they know which employees did what. I’m not saying they should name and shame, but they could (and should) easily eat the cost and pass it through to those employees, whether or not it also comes with HR disciplinary action.

    • explodicle@sh.itjust.works
      · 7 days ago

      I think it’s more respectable to provide unfiltered internet than it is to profit off someone else’s work without paying them.

    • Steve Dice@sh.itjust.works
      · 6 days ago (edited)

      You’re joking, but it was 2,400 movies over 7 years, downloaded individually and not in bulk like they blatantly did with books. Apparently over 60% of adults admit to having viewed porn at work, so yeah… someone should probably check on the engineers in the goon cave.

    • NateNate60@lemmy.world
      · 8 days ago

      I would not be surprised if Meta advertised such a thing to prospective employees as a legitimate benefit of the job. A built-in VR goon cave with 30 TB of material available. Limit 1 hour per person, bookings required 6 months in advance. Sessions subject to monitoring for security and training purposes. May contain trace amounts of Zuck.

      • auraithx@lemmy.dbzer0.com
        · 7 days ago

        Wouldn’t need any material. Hooks up to your fb and you can pick which of your pals to hook up to the milking table.

      • FishFace@piefed.social
        · 7 days ago

        Then you should be less credulous. What is told to prospective employees is effectively public information.

  • Ignotum@lemmy.world
    · 7 days ago

    “hey steve, did you download a shitton of porn while on the company network?”
    “Uuuhhhhh, it’s for ai training”

  • RedFrank24@lemmy.world
    · 7 days ago

    Torrent The Dark Knight to watch at home alone and the media companies will sue you for infinity billion dollars. Openly torrent every movie known to man to train an AI and the media companies don’t do shit.

  • Phoenixz@lemmy.ca
    · 7 days ago

    An even more “glaring” defect, Meta argued, is that Meta’s terms prohibit generating adult content, “contradicting the premise that such materials might even be useful for Meta’s AI training.”

    Oh yes, this is true, because Meta, or any other company for that matter, has never ever in the history of ever changed its terms of service…

  • Lodespawn@aussie.zone
    · 8 days ago

    I can only assume the MPAA will funnel vast sums of money into helping prosecute these thieves? Any minute now right?

  • Phoenixz@lemmy.ca
    · 7 days ago (edited)

    One wonders how much child porn was in there…

    But it’s AI, so itsa aaaaalllll fine

    • manuallybreathing@lemmy.ml
      · 7 days ago

      It’s better to say child sexual abuse material (CSAM). The term “child porn” both legitimizes the content and implies children could ever be active and consenting participants.

      Sexualised content involving children is abuse and should be labelled as such.

    • Evotech@lemmy.world
      · 7 days ago

      It’s kinda weird how specific you have to be with certain models to not make them very, very young-looking people.