    Apple plans to scan your images for child porn

scottalanmiller:

      https://appleinsider.com/articles/21/08/07/epic-games-ceo-slams-apple-government-spyware

stacksofplates @DustinB3403:

        @dustinb3403 said in Apple plans to scan your images for child porn:

        @stacksofplates said in Apple plans to scan your images for child porn:

        @dustinb3403 said in Apple plans to scan your images for child porn:

        @stacksofplates said in Apple plans to scan your images for child porn:

        The scan results would have to include the photo.

Actually no, the on-device scans create a hash record (MD5 or SHA-256, probably), which is then compared against a known database of CSAM hashes.

        Anything that matches would start sending up red flags.

        The actual photo may never get uploaded to iCloud.

That's a joke, right? You didn't read the article. They're using a neural network to compare an image to a database of checksummed images, presumably by features like faces, EXIF data, etc. Then a human verifies it's a match to content in the existing checksummed image.

A 4-year-old could get around comparing two images by checksum, so that's clearly not what's happening here. Just change a single pixel and the checksum is different. You don't need a neural net to compare checksums.

        By the explanation in the article, they have to have the photo to compare.

Wrong, the on-device code is creating a hash, and that hash record is getting compared. Read the announcement again from Apple.

The machine learning comparison doesn't come in until the image is in iCloud. That's where the comparison happens, and then if a threshold is hit, a human compares the images/hashes.

        The AI is running on device. Not sure where you read it's not. It's the same on device AI they are using for the iMessage sexually explicit verification.
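
To illustrate the checksum point above (a minimal sketch, not Apple's code): flipping a single bit of an image produces a completely different cryptographic hash, which is why naive checksum comparison is trivially defeated.

```python
# Minimal sketch (not Apple's code): a cryptographic hash such as
# SHA-256 changes completely when a single byte of the input changes.
import hashlib

original = bytearray(b"...raw image bytes...")
tampered = bytearray(original)
tampered[0] ^= 0x01  # flip one bit, i.e. "change a single pixel"

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(tampered).hexdigest())  # entirely different digest
```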

stacksofplates @DustinB3403:

Wrong, the on-device code is creating a hash, and that hash record is getting compared. Read the announcement again from Apple.

The machine learning comparison doesn't come in until the image is in iCloud. That's where the comparison happens, and then if a threshold is hit, a human compares the images/hashes.

The official statement doesn't even mention AI/neural networks in any way. Here's an excerpt from their technical paper:

          NeuralHash
          NeuralHash is a perceptual hashing function that maps images to numbers. Perceptual hashing bases this
          number on features of the image instead of the precise values of pixels in the image. The system computes
          these hashes by using an embedding network to produce image descriptors and then converting those
          descriptors to integers using a Hyperplane LSH (Locality Sensitivity Hashing) process. This process
          ensures that different images produce different hashes.

          Before an image is stored in iCloud Photos, the following on-device matching process is performed for that
          image against the blinded hash table database. The device computes the image NeuralHash and looks up
          the entry in the blinded hash table at the position pointed by the NeuralHash. The device uses the
          computed NeuralHash to compute a cryptographic header. It also uses the blinded hash that the system
          looked up to obtain a derived encryption key. This encryption key is then used to encrypt the associated
          payload data.

          The AI is running on the phone and doing image verification based on features, not just a checksum.

          Also it's eavesdrop.
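
For anyone curious, here's a rough sketch of the hyperplane-LSH idea from the quoted passage. The dimensions and embeddings are made up for illustration; this is not Apple's NeuralHash.

```python
# Rough sketch of hyperplane LSH (illustrative, not Apple's NeuralHash):
# project an embedding onto random hyperplanes and pack the signs into
# an integer, so perceptually similar images get mostly the same bits.
import numpy as np

rng = np.random.default_rng(seed=0)
hyperplanes = rng.normal(size=(64, 128))  # 64 hash bits from a 128-d embedding

def lsh_hash(embedding: np.ndarray) -> int:
    bits = hyperplanes @ embedding > 0  # which side of each hyperplane
    return int("".join("1" if b else "0" for b in bits), 2)

emb = rng.normal(size=128)                   # stand-in for a network's descriptor
similar = emb + 0.01 * rng.normal(size=128)  # a perceptually similar image
print(bin(lsh_hash(emb) ^ lsh_hash(similar)).count("1"))  # few differing bits
```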

stacksofplates:

            Also, if you look at their diagram in their white paper, the photo is part of the safety voucher, which is what is uploaded to iCloud.

[attached: diagram from Apple's white paper]

            So this is what I was getting at earlier.

            This voucher is uploaded to iCloud Photos along with the image.

Is that separate from iCloud Backup, or is the voucher sent along with the image when it's backed up? By their process description, the photo has to be sent as well, because they can't verify it otherwise.

            This is why it's not straightforward and why I think @Carnival-Boy was making those statements.

Obsolesce:

To me it's clear from the white paper that if you don't upload images to iCloud, then this doesn't work. However, since I'm not an iPhone user, I don't know if you have any control over whether or not the photos stored on your phone are uploaded to your iCloud account.

Basically, when images are uploaded to your iCloud account, by your doing or automatically, your phone first does this hashing magic with CSAM results and then packages it along with your photo that is stored in your iCloud account. At that point, the iCloud scanners simply read the results that are packaged with your photo. I think of it like the photo having an attached health certificate that the iCloud scanner can pick up.
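
A toy sketch of that "health certificate" packaging idea. The field names and structure here are my own guesses for illustration, not Apple's actual voucher format.

```python
# Toy sketch of the "photo + attached certificate" packaging idea.
# Field names are guesses for illustration, not Apple's actual format.
import hashlib
from dataclasses import dataclass

@dataclass
class SafetyVoucherSketch:
    neural_hash: str          # perceptual hash computed on-device
    encrypted_payload: bytes  # match metadata, opaque to the server

def package_for_upload(image_bytes: bytes) -> dict:
    voucher = SafetyVoucherSketch(
        neural_hash=hashlib.sha256(image_bytes).hexdigest(),  # stand-in hash
        encrypted_payload=b"...",                             # placeholder
    )
    # The voucher travels to iCloud together with the photo; the server
    # only reads the voucher, it never re-scans the image itself.
    return {"photo": image_bytes, "voucher": voucher}
```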

scottalanmiller:

Looks like Apple has gone fully off the rails. Full-on spying on their workers at home...

                https://forums.macrumors.com/threads/apple-responds-to-call-center-worker-complaints-about-plans-to-monitor-them-with-cameras-at-home.2306989/

scottalanmiller @Obsolesce:

                  @obsolesce said in Apple plans to scan your images for child porn:

To me it's clear from the white paper that if you don't upload images to iCloud, then this doesn't work. However, since I'm not an iPhone user, I don't know if you have any control over whether or not the photos stored on your phone are uploaded to your iCloud account.

Basically, when images are uploaded to your iCloud account, by your doing or automatically, your phone first does this hashing magic with CSAM results and then packages it along with your photo that is stored in your iCloud account. At that point, the iCloud scanners simply read the results that are packaged with your photo. I think of it like the photo having an attached health certificate that the iCloud scanner can pick up.

The issue is that they are scanning without the upload, and based on that they can be forced to report on you. The upload limitation is an option on their end that can't be enforced. So it's moot.

Dashrender @scottalanmiller:

                    @scottalanmiller said in Apple plans to scan your images for child porn:

Looks like Apple has gone fully off the rails. Full-on spying on their workers at home...

                    https://forums.macrumors.com/threads/apple-responds-to-call-center-worker-complaints-about-plans-to-monitor-them-with-cameras-at-home.2306989/

                    Not surprised - this is the exact mentality that some had around here... If I can't see you working/not working, then I assume you're not working. /sigh.

Obsolesce @scottalanmiller:

                      @scottalanmiller said in Apple plans to scan your images for child porn:

The issue is that they are scanning without the upload, and based on that they can be forced to report on you.

                      It seems like this whole thing is only put into action via a trigger, which is the uploading of the image to your iCloud account. Before then, there's no way for any reporting. The reporting is done via iCloud servers based on the image voucher that's with your image in iCloud.

[attached: diagram from Apple's technical summary]

                      This whole thing is all about the iCloud Photos account accumulating enough CSAM matching vouchers. Without your photos being uploaded to your iCloud Photos account, this whole thing is moot.

                      For reference: https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf
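
A minimal sketch of that threshold behavior, as I read the summary. The threshold value and names here are assumptions for illustration, not Apple's published numbers.

```python
# Minimal sketch: the server only acts once an account accumulates
# enough CSAM-matching vouchers. The threshold here is made up.
MATCH_THRESHOLD = 30

def should_trigger_review(matching_voucher_count: int) -> bool:
    # Below the threshold the server learns nothing actionable;
    # at or above it, human review of the matched images begins.
    return matching_voucher_count >= MATCH_THRESHOLD

assert not should_trigger_review(5)
assert should_trigger_review(30)
```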

ITivan80:

This is something law enforcement should be doing with a court-ordered warrant, not Apple. Talk about big tech overreach. I get it, they want to stop it before it happens, but they are not the police.

1337 @ITivan80:

                          @itivan80 said in Apple plans to scan your images for child porn:

This is something law enforcement should be doing with a court-ordered warrant, not Apple. Talk about big tech overreach. I get it, they want to stop it before it happens, but they are not the police.

I agree. And also, why not scan for every other type of potential crime? Drug trafficking, terrorism, murder, war crimes, etc.

Maybe Apple should record all your conversations with the built-in mic in your device to keep track of what you're saying - without you knowing and without a warrant, of course. Oh, I forgot, they were already caught doing that with Siri two years ago.

Dashrender @1337:

                            @pete-s said in Apple plans to scan your images for child porn:

                            @itivan80 said in Apple plans to scan your images for child porn:

This is something law enforcement should be doing with a court-ordered warrant, not Apple. Talk about big tech overreach. I get it, they want to stop it before it happens, but they are not the police.

I agree. And also, why not scan for every other type of potential crime? Drug trafficking, terrorism, murder, war crimes, etc.

Maybe Apple should record all your conversations with the built-in mic in your device to keep track of what you're saying - without you knowing and without a warrant, of course. Oh, I forgot, they were already caught doing that with Siri two years ago.

                            This is Scott's entire point...

And this child porn scanning is literally just the foot in the door - tomorrow they WILL be searching for those things, because warrants will make them.

JaredBusch @ITivan80:

                              @itivan80 said in Apple plans to scan your images for child porn:

This is something law enforcement should be doing with a court-ordered warrant, not Apple. Talk about big tech overreach. I get it, they want to stop it before it happens, but they are not the police.

No, because without the backdoor breaking the chain of encryption, nothing is there for a warrant...

scottalanmiller @ITivan80:

                                @itivan80 said in Apple plans to scan your images for child porn:

This is something law enforcement should be doing with a court-ordered warrant, not Apple. Talk about big tech overreach. I get it, they want to stop it before it happens, but they are not the police.

Well, in a way, they are now acting as private, unlicensed police.

scottalanmiller:

                                  https://www.reuters.com/technology/exclusive-apples-child-protection-features-spark-concern-within-its-own-ranks-2021-08-12/

Obsolesce @stacksofplates:

                                    @stacksofplates said in Apple plans to scan your images for child porn:

                                    Also, if you look at their diagram in their white paper, the photo is part of the safety voucher, which is what is uploaded to iCloud.

[attached: diagram from Apple's white paper]

                                    So this is what I was getting at earlier.

                                    This voucher is uploaded to iCloud Photos along with the image.

Is that separate from iCloud Backup, or is the voucher sent along with the image when it's backed up? By their process description, the photo has to be sent as well, because they can't verify it otherwise.

                                    This is why it's not straightforward and why I think @Carnival-Boy was making those statements.

It looks like it's only a single package that goes to iCloud. Either you choose to back up your photo to iCloud, in which case it's packaged with the voucher... or NOTHING happens at all. The photo is not sent to iCloud and no scanning or anything happens.

Obsolesce @scottalanmiller:

                                      @scottalanmiller said in Apple plans to scan your images for child porn:

                                      @obsolesce said in Apple plans to scan your images for child porn:

To me it's clear from the white paper that if you don't upload images to iCloud, then this doesn't work. However, since I'm not an iPhone user, I don't know if you have any control over whether or not the photos stored on your phone are uploaded to your iCloud account.

Basically, when images are uploaded to your iCloud account, by your doing or automatically, your phone first does this hashing magic with CSAM results and then packages it along with your photo that is stored in your iCloud account. At that point, the iCloud scanners simply read the results that are packaged with your photo. I think of it like the photo having an attached health certificate that the iCloud scanner can pick up.

The issue is that they are scanning without the upload, and based on that they can be forced to report on you. The upload limitation is an option on their end that can't be enforced. So it's moot.

                                      Another confirmation:

[attached: screenshot from Apple's technical summary]

JaredBusch @Obsolesce:

@obsolesce if it was only on upload, why put the signatures on every phone?

Obsolesce @JaredBusch:

                                          @jaredbusch said in Apple plans to scan your images for child porn:

@obsolesce if it was only on upload, why put the signatures on every phone?

They need to be on the device if photos are going to be uploaded to iCloud Photos, since I assume uploading photos is the default behavior. I'm sure there are tons of things on these devices that aren't ever used. I doubt the signatures being on the phone matter if they're not used; that'll likely be the minority case.

scottalanmiller @Obsolesce:

                                            @obsolesce said in Apple plans to scan your images for child porn:

I doubt the signatures being on the phone matter if they're not used; that'll likely be the minority case.

                                            They are a problem because they can be used. The gov't can force the upload with a warrant. So the existence of the capability is the exposure itself.

It's true, semantically we can say that there is no risk without an upload (so a phone that is dead, for example, is not affected). But since the gov't, or Apple as an organization, has the power to enact the upload at will, the problem exists before that point.

                                            Similarly we could say that the upload doesn't matter because we aren't exposed until the data from the upload is handed to the gov't.

                                            Or we could say that the gov't stealing our data doesn't matter until they use it maliciously.

                                            Or we could say that being arrested and taken to court for pictures that Apple claims to have found on our phones, or the police claim to have gotten from Apple, doesn't matter until we end up in jail for something we didn't do.

Any step taking away our freedom of speech, any step threatening journalists or minorities can be seen as irrelevant until it is fully used to hurt someone. But that's not the case. The power and opportunity itself is a threat, and threats are primarily what stop freedoms.
