The #eu is trying to push #csam #scanning of your #private #communications in the name of " #protecting the #children ": https://tutanota.com/blog/posts/eu-surve.. What this will really do is increase the scope of the #surveillance state... This is about #control .
#apple are finally backing down on #csam #scanning on #devices , which would most definitely end up being #abused by #state #actors and #governments in general to #hunt for people: https://www.macrumors.com/2021/12/15/app.. Looking at the #website #comments , it looks like some of the #blame is being put on #hongkong #protestors too...
Decent #article against #apple 's #csam plans: https://www.theatlantic.com/ideas/archiv.. Great quote from the article: "The logic of catching a few evil actors by denying the cloak of privacy to everyone will inexorably expand to more and more areas that powerful societal factions want to target." Can't get much more to the point than that.
#apple has now delayed the #csam rollout after pressure from around the world: https://www.macrumors.com/2021/09/03/app.. Still, it is only delayed, not cancelled. Apple officially cannot be trusted; this backdoor into your #device *will* be #exploited ... I understand their intentions are good, but even the best of intentions can lead to the worst of #consequences - just look at #communism !
Apparently #apple have been #scanning #icloud for #csam for the last two years: https://9to5mac.com/2021/08/23/apple-sca.. Fun! According to Apple's #antifraud #chief , they are "the greatest #platform for #distributing #childporn " - ouch. Assuming they have been using the #csam algorithm this whole time, we must ask whether it has *already* been abused by #hostile #states in order to detect #journalists or people of interest.
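For anyone wondering what that #scanning actually looks like mechanically, here is a deliberately generic sketch - not Apple's real pipeline (which uses #neuralhash plus private set intersection), and the hashes below are made up - of how matching uploads against a #database of known-image hashes tends to work:

```python
# Generic sketch of server-side hash matching against a database of known
# images. The 16-bit hashes are invented for illustration; real systems use
# much longer perceptual hashes.
def hamming(a: str, b: str) -> int:
    """Number of differing bits between two equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b))

KNOWN_BAD = {
    "1011000111010010",
    "0001110100101101",
}

def is_flagged(upload_hash: str, threshold: int = 2) -> bool:
    # A near-match, not just an exact match, is enough to flag an upload -
    # which is exactly why collisions and near-collisions matter so much.
    return any(hamming(upload_hash, known) <= threshold for known in KNOWN_BAD)

print(is_flagged("1011000111010011"))  # True: one bit away from a known hash
print(is_flagged("0000000000000000"))  # False: nowhere near the database
```

The flagging logic has no idea what the hashes in that database actually represent - it will match whatever somebody manages to get added to it.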
#apple 's #csam is even worse than previously thought - people are finding #hash #collisions in the wild: https://blog.roboflow.com/nerualhash-col..
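To see why #hash #collisions happen at all, here is a toy perceptual hash (a plain average hash - nothing like the learned #neuralhash model, and the file names are hypothetical): perceptual hashes deliberately throw away detail so that near-duplicates still match, which is also why unrelated images can occasionally land on the same value.

```python
# Toy perceptual hash (average hash) - NOT NeuralHash, just an illustration of
# why two visually different images can end up with the same hash.
from PIL import Image
import numpy as np

def average_hash(path, hash_size=8):
    # Shrink to an 8x8 grayscale thumbnail, then keep one bit per pixel:
    # 1 if the pixel is brighter than the thumbnail's mean, else 0.
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = np.asarray(img, dtype=np.float32)
    return "".join("1" if p > pixels.mean() else "0" for p in pixels.flatten())

# Hypothetical files: any two images with a similar coarse brightness layout
# will collide, even if they look nothing alike at full resolution.
# print(average_hash("some_meme.jpg") == average_hash("unrelated_photo.jpg"))
```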
@barray on Thu Aug 19 00:10:40 UTC 2021 said: And now the #source #code is out there for generating your own #csam #hash #collision that can fool #apple 's #detection #algorithm https://github.com/anishathalye/neural-h.. Apple claim it will be impossible to figure out what #vector they are using to trigger detection, but it's only a matter of time before somebody figures those vectors out with a #database of #childporn (something the #uk #police , for example, store). So essentially #stateactors have the ability to #reverseengineer their vectors.
@barray on Thu Aug 19 00:00:50 UTC 2021 said: Scratch that, it looks as if people are actually generating #hash #collisions by modifying an existing image to match a #hash https://github.com/AsuharietYgvar/AppleN.. #apple have really fucked up on the response to this one... They should at the very least put #csam on ice until #researchers have been given more time to figure this out. Also, claiming that it's "not the final #version " is not good enough, especially when you invited people to #attack it!
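For a rough idea of what these collider tools do under the hood (heavily hedged: the tiny network below is a random stand-in, not Apple's model and not the linked repos' code): if the hash is just the sign pattern of a differentiable model's outputs, you can nudge an image's pixels by gradient descent until its bits match a chosen target hash - the same basic trick as modifying an existing image to match a hash.

```python
# Sketch of a gradient-based hash collision against a STAND-IN hash model.
# Assumption: the real system is a neural network whose binarised outputs form
# the hash; here a small random conv net plays that role for illustration.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in "perceptual hash": conv net -> 96 logits, whose signs are the hash bits.
hash_net = nn.Sequential(
    nn.Conv2d(3, 8, 5, stride=4), nn.ReLU(),
    nn.Conv2d(8, 16, 5, stride=4), nn.ReLU(),
    nn.Flatten(), nn.Linear(16 * 3 * 3, 96),   # 64x64 input -> 3x3 feature map
)

def hash_bits(x):
    return (hash_net(x) > 0).int()

source = torch.rand(1, 3, 64, 64)                    # the image we want to perturb
target_bits = hash_bits(torch.rand(1, 3, 64, 64))    # the hash we want to collide with

delta = torch.zeros_like(source, requires_grad=True) # adversarial perturbation
opt = torch.optim.Adam([delta], lr=0.01)

for _ in range(500):
    opt.zero_grad()
    logits = hash_net((source + delta).clamp(0, 1))
    sign = target_bits.float() * 2 - 1               # +1/-1 pattern of the target hash
    # Hinge loss pushes every logit past the sign the target hash demands;
    # the L1 penalty keeps the perturbation visually small.
    loss = torch.relu(1.0 - sign * logits).mean() + 0.1 * delta.abs().mean()
    loss.backward()
    opt.step()

matched = (hash_bits((source + delta).clamp(0, 1)) == target_bits).float().mean()
print(f"fraction of hash bits matching the target: {matched.item():.2f}")
```

Which is also why "nobody will know the real model" is a weak defence: once the model (or a close replica) can be pulled off the device, the optimisation target is right there.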
@barray on Wed Aug 18 10:48:11 UTC 2021 said: #apple are claiming there is no issue with their #csam #detection #algorithm because #researchers are not using the "final version", despite being invited to #test it: https://www.vice.com/en/article/wx5yzq/a.. The lack of awareness on this is actually shocking... Even if researchers can't produce #hash #collisions on the #private version, #independent #nations can!
Just a few days ago somebody replicated the #apple #csam #detection #algorithm https://github.com/AsuharietYgvar/AppleN.. Just now, somebody has successfully produced a #hash #collision - *exactly* what people were concerned about: https://github.com/AsuharietYgvar/AppleN.. Apparently it's even worse than this - different #hardware generates a different hash because of #float #precision ...
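The #float #precision point is easy to picture with made-up numbers: the hash comes from binarising a float embedding, so any component that lands near the threshold can round differently on different #hardware and flip a bit.

```python
# Illustration with invented numbers: one embedding component sits almost
# exactly on the binarisation threshold of zero.
import numpy as np

embedding = np.array([0.31, -1e-7, 1.7, -2.2], dtype=np.float32)

def to_bits(vec):
    # One hash bit per component: 1 if the value is >= 0, else 0.
    return "".join("1" if v >= 0 else "0" for v in vec)

# Simulate another device whose accumulated rounding error differs by ~2e-7.
other_device = embedding + np.float32(2e-7)

print(to_bits(embedding))      # 1010
print(to_bits(other_device))   # 1110 - same "image", different hash on this device
```

It only takes one borderline component for two devices to disagree about the hash of the same photo.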
It looks like even the #german #government is urging #apple to reconsider their #csam #scanning of users' #files https://appleinsider.com/articles/21/08/.. Hopefully they get rid of the #backdoor ...
#apple completely miss the point by continuing to push #csam #child #safety #measures https://www.reuters.com/technology/after.. The point is that these powers will expand and #freedom will erode. Apple themselves literally say: "This program is ambitious, and protecting children is an important responsibility. These efforts will evolve and expand over time."
Looks like the #eu is joining #apple in #csam #child #pornography #protections https://www.patrick-breyer.de/en/posts/m.. Now companies have permission to search *all* #private #encrypted #messages for any offensive material. *I'm sure this won't be misused in the near future.* With #chatcontrol they have taken away #secrecyofcorrespondence from #europeans - thank goodness for the #uk and #brexit ! I suspect #boris will just roll out something worse though...