felter wrote: ↑Fri, 6. Aug 21, 05:08
Onto something totally different. Apple is going to be actively searching for child pornography on iPhones and iPads. They have designed software that looks at any picture stored on iCloud, comparing its hash against a register of known child pornography images. Am I the only one who has an issue with this? Fair play that they want to stop child pornography, I'm all for that, but they are actively going to search for it, and when they think they have found some, they are going to view the potential child pornography before reporting it to the police. They are a private company; they are not employed by the police or any government agency to search out child pornography, yet they are going to actively hunt for these pictures so they can look at them before deciding whether a crime has been committed. It's only going to be done in the US, probably because a lot of places have laws against searching for and looking at child pornography. Some people are up in arms about the privacy issue, but to me, if they are looking for child pornography without the authorisation of some kind of law enforcement agency, then they are no better than the ones they are reporting; they will just be paedophiles with a conscience at the end of the day.
You are missing the point here. Tech companies scan for child abuse material because of government pressure to do so; they won't lift a finger to do anything that costs them money out of the goodness of their hearts. Most tech companies already check any files they hold on their servers against lists of hashes of known material, e.g. https://www.iwf.org.uk/our-services/hash-list. This is because people sharing child abuse imagery over messaging apps, social media networks, and the like is a huge problem - one of the worst consequences of the internet is how easy it has made access to this sort of material - and tech companies scanning against hash lists is a compromise, where the alternative is governments passing laws demanding direct access so they can check for themselves.
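For anyone unfamiliar with what "checking against a hash list" actually involves, here's a deliberately simplified sketch in Python. Everything in it is hypothetical: real systems use perceptual hashes like PhotoDNA that survive resizing and re-encoding, whereas plain SHA-256 only catches byte-identical copies. The point is just that the scan is a lookup against a fixed list of known images, not software "looking at" your photos in any human sense.

```python
# A minimal sketch of server-side hash-list matching, NOT any vendor's
# real implementation. Production systems (PhotoDNA, the IWF list above)
# use perceptual hashes robust to resizing and re-encoding; SHA-256 is
# used here only to show the matching mechanism.
import hashlib
import sys
from pathlib import Path

# Hypothetical hash list: hex digests of known abuse images, as would be
# supplied by an organisation like the IWF. Populated with a dummy value.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_digest(path: Path) -> str:
    """SHA-256 digest of a file's bytes, streamed to bound memory use."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def scan_upload(path: Path) -> bool:
    """Return True if the uploaded file matches the known-hash list."""
    return file_digest(path) in KNOWN_HASHES

if __name__ == "__main__":
    for name in sys.argv[1:]:
        print(name, "MATCH" if scan_upload(Path(name)) else "clean")
```

Note that nothing in this flow requires a human to view anything unless a match occurs; a clean file produces no signal at all.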
This story is in the news because Apple is moving the scanning process from iCloud's servers to iPhones themselves, likely as a precursor to providing end-to-end encryption of iCloud backups (after which they will no longer be able to scan server-side, obviously). I assume most privacy concerns stem from a perceived difference in invasiveness between the scanning happening server-side and it happening on the device. There's also a vague slippery-slope argument being made - "if they can check for this, what will they be asked to check for next?" - but I'm not sure the people making it understand the extent to which scanning for child abuse material already happens. Doing these scans device-side is likely the only way to stop end-to-end encrypted chat apps (which is what most modern messengers are) from simply becoming a way of sharing child abuse material in bulk, undetectably, so I imagine we'll start seeing it more.
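To see why end-to-end encryption forces the scan onto the device, consider the order of operations: once a photo is encrypted with a key the server never holds, the server has nothing meaningful to hash. Below is a toy Python sketch of that device-side flow. Every name in it is a hypothetical stand-in - Apple's actual design (NeuralHash, blinded hash tables, threshold "safety vouchers") is far more elaborate and hides even the match result behind cryptography.

```python
# Sketch of device-side scanning as a precursor to end-to-end encrypted
# backups. All names here are hypothetical stand-ins, not Apple's design.
import hashlib
import os

# Hypothetical on-device copy of the known-hash database (dummy value).
DEVICE_HASH_DB = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; real systems tolerate re-encoding,
    SHA-256 here is only a placeholder."""
    return hashlib.sha256(image_bytes).hexdigest()

def e2e_encrypt(blob: bytes, key: bytes) -> bytes:
    """Placeholder for real end-to-end encryption (e.g. AES-GCM). XOR with
    a repeating key is NOT secure; it just keeps the sketch dependency-free."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(blob))

def upload_photo(image_bytes: bytes, device_key: bytes) -> tuple[bytes, bool]:
    """Scan locally, THEN encrypt. The server receives only ciphertext plus
    a match flag (the real system hides even that flag cryptographically)."""
    matched = image_hash(image_bytes) in DEVICE_HASH_DB
    ciphertext = e2e_encrypt(image_bytes, device_key)
    return ciphertext, matched

if __name__ == "__main__":
    ct, flag = upload_photo(b"example photo bytes", os.urandom(32))
    print(len(ct), "bytes uploaded, match:", flag)
```

The key design point is that the match happens before encryption: server-side scanning and end-to-end encrypted backups are mutually exclusive, which is why the scan moving on-device is the tell that encrypted backups are coming.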
edit to add: It's also well known that the moderators of big social networks etc. get exposed to an enormous amount of this sort of material in the course of their work (and will regardless of whether any hash-based checking is happening, of course), so "employees of tech firms having to view child abuse imagery at work" is not a new issue either. Facebook in particular is notorious for underpaying and overworking these people and not providing them with adequate mental health support.
A still more glorious dawn awaits, not a sunrise, but a galaxy rise, a morning filled with 400 billion suns - the rising of the Milky Way