(The Register)   Apple decides that maybe forcing everyone to stop using Apple products is a bad idea. Maybe   (theregister.com)

2614 clicks; posted to Business » and STEM » on 04 Sep 2021 at 4:25 AM



56 Comments

 
2021-09-04 4:44:57 AM  
I feel like this is something they have already implemented and will continue to use despite what they say
 
2021-09-04 5:09:21 AM  
Apple doesn't sell products.  They sell a lifestyle brand.
 
2021-09-04 5:24:14 AM  
Is this where Android and Apple fans shiat on the "opposing" product?
Ffs it's a phone. Use whatever works for you and makes you happy.
 
2021-09-04 6:20:07 AM  
If Google did the same thing and saw the state of some of the wives' and girlfriends' saucy pics people have on their phones, it would probably just result in those girls getting targeted ads for diet pills and fungal cream. The main reason Apple are pausing this is that the hashes can easily be fooled into giving far too many false positives, plus all the human manpower it would take just to check them.
I would applaud Apple funding an interagency sting on the sources of CP and hundreds of other harmful porn genres, prosecuting and closing down the download sites that exist to spread such crap. If that was their main concern, it's where they would have gone first.
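To make the false-positive worry concrete, here's a toy sketch of perceptual-hash matching. Everything here is made up for illustration (64-bit hashes, a Hamming-distance cutoff of 4); Apple's actual NeuralHash differs in the details, but the shape is the same: near-misses count as matches, which is exactly where false positives come from.

```python
# Toy perceptual-hash matcher (NOT Apple's NeuralHash -- hash width,
# distance metric, and cutoff are all invented for illustration).
# Perceptual hashes of visually similar images differ in only a few bits,
# so matchers compare Hamming distance against a cutoff instead of
# requiring exact equality -- and a loose cutoff invites false positives.

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

def is_match(h: int, known: int, cutoff: int = 4) -> bool:
    return hamming(h, known) <= cutoff

known_bad = 0xDEADBEEFCAFEF00D

# A re-encoded copy of the same image: only a couple of bits flip.
near_copy = known_bad ^ 0b101           # 2 bits differ -> still matches
unrelated = known_bad ^ 0xFFFF0000FFFF  # many bits differ -> no match

print(is_match(near_copy, known_bad))   # True
print(is_match(unrelated, known_bad))   # False
```

Loosen the cutoff and more unrelated images land within range of a known hash; tighten it and re-encoded copies slip through. That tradeoff is why every flagged match still needs a human looking at it.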
 
2021-09-04 6:29:57 AM  

omg bbq: Is this where Android and Apple fans shiat on the "opposing" product?
Ffs it's a phone. Use whatever works for you and makes you happy.


Did you even read tfa?
 
2021-09-04 6:39:13 AM  

Abe Vigoda's Ghost: omg bbq: Is this where Android and Apple fans shiat on the "opposing" product?
Ffs it's a phone. Use whatever works for you and makes you happy.

Did you even read tfa?


Been here 16 years and asks a question like that. I had no idea they were selling fark accounts now.

[Fark user image 425x123]
 
2021-09-04 6:55:38 AM  
Because most folks don't want to hire a lawyer to explain to prosecutors why the photo I took of my baby in the bathtub is not child pron. I have never had to explain my way out of a charge like that, but it sounds like the worst uphill battle of a lifetime.
 
2021-09-04 8:38:42 AM  

Lambskincoat: Because most folks don't want to hire a lawyer to explain to prosecutors, why the photo I took of my baby in the bath tub, is not child pron. I have never had to explain my way out of a charge like that, but it sounds like the worst uphill battle of a lifetime.


You've taken pictures of your naked baby in a bathtub? You sick fark!
 
2021-09-04 9:06:51 AM  

Lambskincoat: Because most folks don't want to hire a lawyer to explain to prosecutors, why the photo I took of my baby in the bath tub, is not child pron. I have never had to explain my way out of a charge like that, but it sounds like the worst uphill battle of a lifetime.


[pbfcomics.com image 800x1062]
 
2021-09-04 11:08:46 AM  
If I owned an Apple product I'd just take hundreds of dick pics a day and fill the cloud with my cock.
 
2021-09-04 12:49:30 PM  
The backgrounders I've read say they are among the last to implement this, so fine go to a competitor where it's already implemented.  The messaging should have started with this, but they didn't want to sound late to the party.  People would be outraged that they didn't do this soon enough.
 
2021-09-04 12:51:59 PM  

styckx: If I owned an Apple product I'd just take hundreds of dick pics a a day and fill the cloud with my cock.


I mean, you can do that anyway.
 
2021-09-04 1:10:21 PM  

recondite cetacean: The backgrounders I've read say they are among the last to implement this, so fine go to a competitor where it's already implemented.  The messaging should have started with this, but they didn't want to sound late to the party.  People would be outraged that they didn't do this soon enough.


The difference is that other providers scan your photos as they are uploaded to their servers. Nobody really has that much of a problem with that.

What Apple wants to do is have your own device do the scanning before uploading to iCloud. That's what has people up in arms.
 
2021-09-04 1:35:13 PM  

trialpha: recondite cetacean: The backgrounders I've read say they are among the last to implement this, so fine go to a competitor where it's already implemented.  The messaging should have started with this, but they didn't want to sound late to the party.  People would be outraged that they didn't do this soon enough.

The difference is that other providers scan your photos as they are uploaded to their servers. Nobody really has that much of a problem with that.

What Apple wants to do is have your own device do the scanning before uploading to iCloud. That's what has people up in arms.


But if Apple is to be believed, iCloud is designed and secured in such a way that they couldn't scan it server side. If true, that means they're the only ones who don't have a raw, unencrypted copy of every image uploaded.

My side of the fence is to put the effort into preventing new or ongoing child abuse or exploitation.

For instance, if Apple truly gave a crap, they'd implement this in Safari to prevent displaying and downloading known images in the first place across all their devices. Why only check local images bound for the cloud?
 
2021-09-04 1:51:48 PM  

Quantumbunny: trialpha: recondite cetacean: The backgrounders I've read say they are among the last to implement this, so fine go to a competitor where it's already implemented.  The messaging should have started with this, but they didn't want to sound late to the party.  People would be outraged that they didn't do this soon enough.

The difference is that other providers scan your photos as they are uploaded to their servers. Nobody really has that much of a problem with that.

What Apple wants to do is have your own device do the scanning before uploading to iCloud. That's what has people up in arms.

But if Apple is too be believed, iCloud is designed and secured in such a way that they couldn't scan it server side. If true, that means they're the only ones where they don't have a raw, unencrypted copy of every image uploaded.

My side of the fence is too put the effort in preventing new or ongoing child abuse or exploitation.

For instance, if Apple truly gave a crap, they'd implement this in Safari to prevent displaying and downloading known images in the first place across all their devices. Why only check local images bound for the cloud?


They don't want to prevent the images on their cloud, or their display in Safari. The primary goal was to ID the people who are buying, selling, and sharing child pornography. Then they hand that evidence over to the authorities. The problem is that it's a massive invasion of privacy and illegal searching on a grand scale. We already know that's happening anyway but Apple doesn't need to join forces with them.
 
2021-09-04 1:53:52 PM  

Quantumbunny: But if Apple is too be believed, iCloud is designed and secured in such a way that they couldn't scan it server side. If true, that means they're the only ones where they don't have a raw, unencrypted copy of every image uploaded.


iCloud apparently doesn't use end to end encryption. So they're perfectly capable of scanning things as they are uploaded.

Quantumbunny: For instance, if Apple truly gave a crap, they'd implement this in Safari to prevent displaying and downloading known images in the first place across all their devices. Why only check local images bound for the cloud?


This would create even more of an uproar. If they were detecting those images as they were downloaded, then it could be argued (if it isn't already a requirement) that they should be reporting the user to the authorities.

Your last question actually raises the problem with doing this to begin with. Once you go down this path, it's just a few steps to a full big brother phone, watching everything you do and reporting you for whatever the authorities don't like.
 
2021-09-04 2:05:38 PM  

styckx: If I owned an Apple product I'd just take hundreds of dick pics a a day and fill the cloud with my cock.


"Filling the cloud with my cock" would be a great lyric.
 
2021-09-04 2:32:55 PM  

gorauma: I feel like this is something they have already implemented and will continue to use despite what they say


And others. Amazon is already purging content from its Web Services platform. Right now it's for "violence" but other banned categories are sure to follow. It's not hard to predict that eventually many things it deems repugnant will follow, such as anti-abortion, guns, saying critical race theory is itself racist, anything that would upset the Chinese government ...
 
2021-09-04 2:36:03 PM  

trialpha: recondite cetacean: The backgrounders I've read say they are among the last to implement this, so fine go to a competitor where it's already implemented.  The messaging should have started with this, but they didn't want to sound late to the party.  People would be outraged that they didn't do this soon enough.

The difference is that other providers scan your photos as they are uploaded to their servers. Nobody really has that much of a problem with that.

What Apple wants to do is have your own device do the scanning before uploading to iCloud. That's what has people up in arms.


I've not read that, but what's the difference between scanning before and after?
 
2021-09-04 2:40:22 PM  
If you have AT&T, buy your Apple crap from the AT&T store instead of the AT&T website. Went with my daughter for her b-day yesterday and her Apple phone thing was $300 cheaper than online.
 
2021-09-04 2:43:19 PM  

trialpha: Quantumbunny: But if Apple is too be believed, iCloud is designed and secured in such a way that they couldn't scan it server side. If true, that means they're the only ones where they don't have a raw, unencrypted copy of every image uploaded.

iCloud apparently doesn't use end to end encryption. So they're perfectly capable of scanning things as they are uploaded.

Quantumbunny: For instance, if Apple truly gave a crap, they'd implement this in Safari to prevent displaying and downloading known images in the first place across all their devices. Why only check local images bound for the cloud?

This would create even more of an uproar. If they were detecting those images as they were downloaded, then it could be argued (if it isn't already a requirement) that they should be reporting the user to the authorities.

Your last question actually raises the problem with doing this to begin with. Once you go down this path, it's just a few steps to a full big brother phone, watching everything you do and reporting you for whatever the authorities don't like.


Apple is probably the most hypocritical company on Earth. Your device comes with a browser that blocks no porn, but an app with something like a NSFW toggle is banned (check out all the Reddit apps, for instance).

Let's scan images on users' phones, but not worry about how they got there.

There are a lot of problems with hash checks technically; there are even more in how you acquire said lists, and in how easy they are to bypass. There's a lot of questionably moral or smarmy stuff in what they're proposing... which isn't even the best place to do it (either endpoint is a much better option).

When they take these kinds of stances, I definitely question the actual goal.
 
2021-09-04 2:43:44 PM  
And more specifically, my problem is the headline.  Is there a statistically significant number of people who are going to change their purchasing habits because of this?  Do people buy iPhones because they *don't* check for CSAM?  Will they change providers thinking others don't check for CSAM?

Apple isn't making this decision for any reason other than they botched the initial explanation.  It will be back with proper messaging.  No one won anything here, you're still trusting your privacy to people who don't give a crap about it, and that will continue to get worse.
 
2021-09-04 3:02:13 PM  

omg bbq: Abe Vigoda's Ghost: omg bbq: Is this where Android and Apple fans shiat on the "opposing" product?
Ffs it's a phone. Use whatever works for you and makes you happy.

Did you even read tfa?

Been here 16 years and asks a question like that. I had no idea they were selling fark accounts now.

[Fark user image image 425x123]


You've been here 16 years, yet you weren't aware that accounts were being sold...? 🤔
 
2021-09-04 3:02:48 PM  
I'm just going to wikipedia this shiat because I don't understand it. We have a slippery slope fallacy, we have human review before referral which removes false positives, other services already do this, and Apple is only targeting things destined for iCloud, which is functionally the same as being scanned on the recipient side. There's no difference between this and other things that are in place right now.

Your original photo of your kids in the bath isn't going to match an NCMEC hash. If for some reason your photo of your stupidly overcooked steak matches a known hash, a person is going to look at the picture of you being a war criminal and not refer you to the Hague (which by rights they should, but won't), and not refer you to law enforcement. They should, but won't.

In 2021, the group faced criticism over a partnership with Apple to integrate software into iOS 15 which will scan iCloud photos to be uploaded as part of iCloud Photo Library for known child pornography. Once known matches are found, the content will be sent to Apple for review and after human review, user data is forwarded to NCMEC for law enforcement review.[36] Critics charged that the software was an unreasonable encroachment on privacy. Edward Snowden described the updated devices as "iNarcs", while the Electronic Frontier Foundation argued that such a backdoor would require little change to expand to look for additional types of content, and that some governments could require Apple to enable such features.[37][38] An editorial in The New York Times by Matthew D. Green and Alex Stamos observed that while many platforms like Facebook, Google, and Microsoft have long screened user uploads to their platforms for abusive material, Apple's promise to only evaluate photos which use its iCloud service is a policy decision, not a technological requirement limiting access to users personal devices.[39] In a company-wide internal letter to Apple employees in response to public backlash against the measure, NCMEC's executive director of strategic partnerships Marita Rodriguez described the criticism as the "screeching voices of the minority."[40]

I hate all kinds of oversight, and you should be secure in [your] persons, houses, papers, and effects. But this seems to be a bunch of nothing. If I have an iPhone, and an iCloud, and I upload to the iCloud, I'm no longer in control of my content. The agreement is with the provider, and the provider has said you are not secure. I have copyright claims, not privacy claims.
 
2021-09-04 3:03:11 PM  

styckx: If I owned an Apple product I'd just take hundreds of dick pics a a day and fill the cloud with my cock.


Hey, you! Get off in my cloud!
 
2021-09-04 3:36:49 PM  

styckx: If I owned an Apple product I'd just take hundreds of dick pics a a day and fill the cloud with my cock.


That's what I do. Don't all Apple users do it?
 
2021-09-04 3:41:56 PM  

trialpha: What Apple wants to do is have your own device


Have your own device do it in a manner where they can't see the results until 30 images are flagged as kiddie porn?

Surely, Google's method of scanning everything on their own servers is more private?

Like the way that Google scans all your photos to identify all the people in them?

That sounds private.
 
2021-09-04 3:49:19 PM  

trialpha: iCloud apparently doesn't use end to end encryption. So they're perfectly capable of scanning things as they are uploaded.


LOL.  Microsoft invented the method to scan everything in your account a decade ago.

The system that scans cloud drives for illegal images was created by Microsoft and Dartmouth College and donated to NCMEC. The organization creates signatures of the worst known images of child pornography, approximately 16,000 files at present. These file signatures are given to service providers who then try to match them to user files in order to prevent further distribution of the images themselves, a Microsoft spokesperson told NBC News. (Microsoft implemented image-matching technology in its own services, such as Bing and SkyDrive.)

Microsoft and Google have been scanning everything in your account for the past decade.

a man [was] arrested on child pornography charges, after Google tipped off authorities about illegal images found in the Houston suspect's Gmail account

The difference between Microsoft, Google, and Apple is that when you perform the scan on the server, instead of on the device, dragnet warrants can be issued that would hand over the account data for everyone who has a false positive match for kiddie porn.

In the same way that dragnet warrants have been issued for the location data that Google hoards on their servers.

It's not like we haven't seen that data hoard used to incriminate innocent people before.

Innocent man, 23, sues Arizona police for $1.5million after being arrested for murder and jailed for six days when Google's GPS tracker wrongly placed him at the scene of the 2018 crime

Not even Apple can see the scan results performed by your device until a 30 matches threshold is reached.
 
2021-09-04 3:59:00 PM  

BullBearMS: The difference between Microsoft, Google, and Apple is that when you perform the scan on the server, instead of on the device, dragnet warrants can be issued that would hand over the account data for everyone who has a false positive match for kiddie porn.


All reporting says that Apple does a hash match according to NCMEC, which isn't just some collision-likely CRC32, and a person reviews those before anything else happens.  False positives should be decreased, and identified before anyone knows about what's on your drive.  There may be gaps in my understanding, but I don't work at Apple so I can't fill in those gaps.
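To make the CRC32 contrast concrete: a cryptographic hash like SHA-256 makes accidental collisions effectively impossible, so an exact-match lookup against a known-hash set is trivial. This sketch is illustrative only (the database contents are invented); note that NCMEC-style databases actually use perceptual hashes such as PhotoDNA precisely because an exact cryptographic hash misses any re-encoded copy.

```python
import hashlib

# Hypothetical known-hash database (SHA-256 hex digests).
known_hashes = {
    hashlib.sha256(b"known-bad-file-bytes").hexdigest(),
}

def flag(file_bytes: bytes) -> bool:
    """Exact-match lookup: collisions are cryptographically negligible."""
    return hashlib.sha256(file_bytes).hexdigest() in known_hashes

print(flag(b"known-bad-file-bytes"))   # True: byte-identical copy
print(flag(b"known-bad-file-bytes2"))  # False: any change misses entirely
```

So the false-positive rate of the exact lookup is essentially zero; the uncertainty being debated in this thread comes entirely from the fuzzy, perceptual side of the matching.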
 
2021-09-04 4:15:44 PM  

recondite cetacean: BullBearMS: The difference between Microsoft, Google, and Apple is that when you perform the scan on the server, instead of on the device, dragnet warrants can be issued that would hand over the account data for everyone who has a false positive match for kiddie porn.

All reporting says that Apple does a hash match according to NCMEC, which isn't just some collision-likely CRC32, and a person reviews those before anything else happens.  False positives should be decreased, and identified before anyone knows about what's on your drive.  There may be gaps in my understanding, but I don't work at Apple so I can't fill in those gaps.


Apple has no idea there has been a match for known kiddie porn on your device until you hit a large number of matches, so the privacy of innocent victims of false positives is increased right off the bat.

After you cross the 30 matches threshold, a human review is triggered before anything else happens.

Given Google's reluctance to hire expensive human beings when an inaccurate algorithm is cheaper, I have no doubt they are turning over the account details for everyone with a single false positive.
 
2021-09-04 4:30:23 PM  

Quantumbunny: trialpha: recondite cetacean: The backgrounders I've read say they are among the last to implement this, so fine go to a competitor where it's already implemented.  The messaging should have started with this, but they didn't want to sound late to the party.  People would be outraged that they didn't do this soon enough.

The difference is that other providers scan your photos as they are uploaded to their servers. Nobody really has that much of a problem with that.

What Apple wants to do is have your own device do the scanning before uploading to iCloud. That's what has people up in arms.

But if Apple is too be believed, iCloud is designed and secured in such a way that they couldn't scan it server side. If true, that means they're the only ones where they don't have a raw, unencrypted copy of every image uploaded.

My side of the fence is too put the effort in preventing new or ongoing child abuse or exploitation.

For instance, if Apple truly gave a crap, they'd implement this in Safari to prevent displaying and downloading known images in the first place across all their devices. Why only check local images bound for the cloud?


Too
 
2021-09-04 5:51:03 PM  

BullBearMS: LOL.  Microsoft invented the method


[Fark user image]
 
2021-09-04 6:02:45 PM  

styckx: If I owned an Apple product I'd just take hundreds of dick pics a a day and fill the cloud with my cock.


Then self-report them all so someone has to look at them.
 
2021-09-04 6:13:15 PM  

austerity101: omg bbq: Abe Vigoda's Ghost: omg bbq: Is this where Android and Apple fans shiat on the "opposing" product?
Ffs it's a phone. Use whatever works for you and makes you happy.

Did you even read tfa?

Been here 16 years and asks a question like that. I had no idea they were selling fark accounts now.

[Fark user image image 425x123]

You've been here 16 years, yet you weren't aware that accounts were being sold...? 🤔


I had no idea people would throw their money away like that.
 
2021-09-04 8:38:40 PM  

BullBearMS: recondite cetacean: BullBearMS: The difference between Microsoft, Google, and Apple is that when you perform the scan on the server, instead of on the device, dragnet warrants can be issued that would hand over the account data for everyone who has a false positive match for kiddie porn.

All reporting says that Apple does a hash match according to NCMEC, which isn't just some collision-likely CRC32, and a person reviews those before anything else happens.  False positives should be decreased, and identified before anyone knows about what's on your drive.  There may be gaps in my understanding, but I don't work at Apple so I can't fill in those gaps.

Apple has no idea there has been a match for known kiddie porn on your device until you hit a large number of matches, so the privacy of innocent victims of false positives is increased right off the bat.

After you cross the 30 matches threshold, a human review is triggered before anything else happens.

Given Google's reluctance to hire expensive human beings when an inaccurate algorithm is cheaper, I have no doubt they are turning over the account details for everyone with a single false positive.


So Apple is better than Google in terms of privacy here?
 
2021-09-04 8:45:42 PM  

Tyrone Slothrop: Lambskincoat: Because most folks don't want to hire a lawyer to explain to prosecutors, why the photo I took of my baby in the bath tub, is not child pron. I have never had to explain my way out of a charge like that, but it sounds like the worst uphill battle of a lifetime.

[pbfcomics.com image 800x1062]


You laugh but I knew someone this happened to.

A Swat raid anyway, after ripping both houses on the property apart all they found was some weed.
 
2021-09-04 9:43:37 PM  

recondite cetacean: BullBearMS: recondite cetacean: BullBearMS: The difference between Microsoft, Google, and Apple is that when you perform the scan on the server, instead of on the device, dragnet warrants can be issued that would hand over the account data for everyone who has a false positive match for kiddie porn.

All reporting says that Apple does a hash match according to NCMEC, which isn't just some collision-likely CRC32, and a person reviews those before anything else happens.  False positives should be decreased, and identified before anyone knows about what's on your drive.  There may be gaps in my understanding, but I don't work at Apple so I can't fill in those gaps.

Apple has no idea there has been a match for known kiddie porn on your device until you hit a large number of matches, so the privacy of innocent victims of false positives is increased right off the bat.

After you cross the 30 matches threshold, a human review is triggered before anything else happens.

Given Google's reluctance to hire expensive human beings when an inaccurate algorithm is cheaper, I have no doubt they are turning over the account details for everyone with a single false positive.

So Apple is better than Google in terms of privacy here?


Apple isn't scanning anything yet, while Google has been scanning everything in your account for the past decade.

That's before you even consider things like Google scanning all your photos with face recognition software to identify all the people.

Hell, before they started running into server disk space issues last year, they uploaded all the photos you took in Android social media apps (without asking your permission) so they could do facial recognition on those photos too.
 
2021-09-04 10:39:17 PM  

recondite cetacean: I've not read that, but what's the difference between scanning before and after?


Why not read the objections from one of the 90+ civil rights organizations slamming the idea?

https://www.theregister.com/2021/08/19/apple_csam_condemned/

recondite cetacean: So Apple is better than Google in terms of privacy here?


Apple doesn't seem to understand that there is a massive difference between a cloud provider scanning things actively uploaded to their servers, versus your own device scanning your files.

It doesn't matter that Apple claims it'll only be used for this specific purpose, that the data won't be sent to Apple unless some condition is met, blah blah blah. Once you allow a user's device to scan their files in order to report them for one specific reason, it will only be a matter of time before that is expanded to many reasons.
 
2021-09-04 11:17:23 PM  

trialpha: Apple doesn't seem to understand that there is a massive difference between a cloud provider scanning things actively uploaded to their servers, versus your own device scanning your files.


Why do you keep on lying about this?

Apple's plans were to scan nothing except photos you upload to iCloud Photos.

Given that Google and Microsoft have been scanning everything you upload to your account for the past decade, why shouldn't everyone refuse to have anything to do with Google and Microsoft?
 
2021-09-04 11:42:17 PM  

BullBearMS: Why do you keep on lying about this?

Apple's plans were to scan nothing except photos you upload to iCloud Photos.


Apple's plan was to have your own device scan your files prior to upload to iCloud. This is the problem. Had their plan been scanning the files on their servers after being uploaded, they wouldn't be getting blasted by 90+ civil rights organizations across the world.
 
2021-09-04 11:55:20 PM  
I predict that they will be required by Russia, China, Saudi Arabia, etc. to scan all files on your iPhone and Mac. This requirement will arrive within weeks of their on-device scanning software going live.

It is a *HUGE* mistake. If they want to scan, do it on their own computers, not on computers that don't belong to them.
 
2021-09-05 12:36:06 AM  

trialpha: BullBearMS: Why do you keep on lying about this?

Apple's plans were to scan nothing except photos you upload to iCloud Photos.

Apple's plan was to have your own device scan your files prior to upload to iCloud. This is the problem. Had their plan been scanning the files on their servers after being uploaded, they wouldn't be getting blasted by 90+ civil rights organizations across the world.


Nope.

You're still lying.

Apple's plan was to scan photos you upload to iCloud Photos only.

If you turn off iCloud Photos, nothing was to be scanned at all.

Meanwhile, Google and Microsoft have been scanning everything in your account for a decade.
 
2021-09-05 12:40:31 AM  

kermit_the_frog: I predict that they will be required by Russia, China, Saudi Arabia etc to scan all files on your iPhone & Mac.


LOL.  You think they wouldn't order Google and Microsoft to scan all the files on Android devices and Windows boxes at the same time?
 
2021-09-05 2:02:18 AM  

trialpha: BullBearMS: Why do you keep on lying about this?

Apple's plans were to scan nothing except photos you upload to iCloud Photos.

Apple's plan was to have your own device scan your files prior to upload to iCloud. This is the problem. Had their plan been scanning the files on their servers after being uploaded, they wouldn't be getting blasted by 90+ civil rights organizations across the world.


He knows that, he just plugs his ears and pretends it doesn't exist when there's no argument against it, then acts like your point is actually something else that he can argue against. There's no point feeding him.
 
2021-09-05 2:17:35 AM  

dyhchong: trialpha: BullBearMS: Why do you keep on lying about this?

Apple's plans were to scan nothing except photos you upload to iCloud Photos.

Apple's plan was to have your own device scan your files prior to upload to iCloud. This is the problem. Had their plan been scanning the files on their servers after being uploaded, they wouldn't be getting blasted by 90+ civil rights organizations across the world.

He knows that, he just plugs his ears and pretends it doesn't exist when there's no argument against it, then acts like your point is actually something else that he can argue against. There's no point feeding him.


LOL.

Google and Microsoft haven't been scanning every file you upload to their cloud for the past decade?

Yes.  Yes they have been.

Apple announces that they found a way to scan their cloud service while protecting the innocent from facing suspicion from an inevitable false positive, and you guys suddenly decide to clutch the pearls?

Pathetic as usual.
 
2021-09-05 2:20:43 AM  

BullBearMS: dyhchong: trialpha: BullBearMS: Why do you keep on lying about this?

Apple's plans were to scan nothing except photos you upload to iCloud Photos.

Apple's plan was to have your own device scan your files prior to upload to iCloud. This is the problem. Had their plan been scanning the files on their servers after being uploaded, they wouldn't be getting blasted by 90+ civil rights organizations across the world.

He knows that, he just plugs his ears and pretends it doesn't exist when there's no argument against it, then acts like your point is actually something else that he can argue against. There's no point feeding him.

LOL.

Google and Microsoft haven't been scanning every file you upload to their cloud for the past decade?

Yes.  Yes they have been.

Apple announces that they found a way to scan their cloud service while protecting the innocent from facing suspicion from an inevitable false positive, and you guys suddenly decide to clutch the pearls?

Pathetic as usual.


k
 
2021-09-05 3:52:49 AM  

dyhchong: trialpha: BullBearMS: Why do you keep on lying about this?

Apple's plans were to scan nothing except photos you upload to iCloud Photos.

Apple's plan was to have your own device scan your files prior to upload to iCloud. This is the problem. Had their plan been scanning the files on their servers after being uploaded, they wouldn't be getting blasted by 90+ civil rights organizations across the world.

He knows that, he just plugs his ears and pretends it doesn't exist when there's no argument against it, then acts like your point is actually something else that he can argue against. There's no point feeding him.


I fail to see how pre-scanning files that are marked for upload on your device and scanning files that have just been uploaded are meaningfully different. Cloud upload is either totally on or totally off with an Apple device.
 
2021-09-05 4:11:04 AM  

Likwit: I fail to see how pre-scanning files that are marked for upload on your device and scanning files that have just been uploaded is meaningfully different. Cloud upload is either totally on or totally off with an Apple device.


The difference is your local courier demanding a key to your house so he can pick up and deliver your packages to/from your dining table so they don't get stolen on your doorstep.

Whether he has a key or not, those packages are destined to go to him, and come from him.

If he only uses the key to pick up and deliver packages, great.

None of the other delivery services have done that, they just pick up and deliver your packages to your doorstep. But this guy has promised to do only that and demanded a key to be able to do it.
 
2021-09-05 4:15:34 AM  
Oh, also he keeps the key if you stop using his company for packages (e.g. if you turn iCloud off, their ability to scan doesn't go away; it just doesn't happen). So if he changes his mind he can now just let himself into the house at his own whim, with his only restriction being his own morals.
 
2021-09-05 5:16:34 AM  

dyhchong: Oh, also he keeps the key if you stop using his company for packages (eg turn iCloud off, their ability to scan doesn't go away if you turn iCloud off, it just doesn't happen). So if he changes his mind he can now just let himself into the house at his own whim with his only restriction being his own morals.


Well... that was a terrible analogy. It's OK. My analogies are legendarily terrible.
 
Displayed 50 of 56 comments





This thread is closed to new comments.
