d3ad social

tags labelled 'csam'

@barray on Tue Jun 07 09:36:17 UTC 2022 said:
The #eu is trying to push #csam #scanning of your #private #communications in the name of " #protecting the #children ": https://tutanota.com/blog/posts/eu-surve.. What this will really do is increase the scope of the #surveillance state... This is about #control .
@barray on Wed Dec 15 16:32:50 UTC 2021 said:
#apple finally backing down on #csam #scanning on #devices , which would most definitely end up being #abused by #state #actors and #governments in general to #hunt for people: https://www.macrumors.com/2021/12/15/app.. Looking at the #website #comments , it seems there is some #blame of #hongkong #protestors too...
@barray on Sun Sep 05 09:25:10 UTC 2021 said:
Decent #article against #apple 's #csam plans: https://www.theatlantic.com/ideas/archiv.. Great quote from the article: "The logic of catching a few evil actors by denying the cloak of privacy to everyone will inexorably expand to more and more areas that powerful societal factions want to target." Can't get much more to the point than that.
@barray on Sat Sep 04 06:04:15 UTC 2021 said:
#apple has now delayed the #csam rollout after pressure from around the world: https://www.macrumors.com/2021/09/03/app.. Still, it is only delayed, not cancelled. Apple officially cannot be trusted, this backdoor into your #device *will* be #exploited ... I understand their intentions are good, but even the best of intentions can lead to the worst of #consequences - just look at #communism !
@barray on Tue Aug 24 05:40:44 UTC 2021 said:
Apparently #apple have been using #csam #detection to #scan #icloud for the last two years: https://9to5mac.com/2021/08/23/apple-sca.. Fun! According to Apple's #antifraud #chief , they are "the greatest #platform for #distributing #childporn " - ouch. Assuming they have been using the #csam algorithm this whole time, we must ask whether it has *already* been abused by #hostile #states in order to detect #journalists or people of interest.
@barray on Thu Aug 19 18:02:03 UTC 2021 said:
#apple 's #csam is even worse than previously thought, people are finding #hash #collisions in the wild: https://blog.roboflow.com/nerualhash-col..
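A quick aside on why collisions like these are unsurprising: a perceptual hash is deliberately many-to-one, so that re-encoding or brightening an image does not change its bits. The toy 8x8 average-hash below is not NeuralHash - it is just a minimal sketch of that many-to-one behaviour, using made-up pixel data.

```python
# Toy perceptual hash (classic "average hash"): two different pixel arrays,
# one bit pattern. This is NOT Apple's NeuralHash, only an illustration of
# the many-to-one design that makes collisions between images inevitable.
import numpy as np

def average_hash(img: np.ndarray) -> int:
    """One bit per pixel, set if the pixel is above the image mean."""
    bits = (img > img.mean()).flatten()
    return int("".join("1" if b else "0" for b in bits), 2)

rng = np.random.default_rng(2)
original = rng.random((8, 8))        # a made-up 8x8 "image"
tweaked = original * 0.9 + 0.05      # mild brightness/contrast edit, every pixel value shifted

print(hex(average_hash(original)))
print(hex(average_hash(tweaked)))    # same value: the hash is built to ignore this edit
print("collision:", average_hash(original) == average_hash(tweaked))
```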
@barray on Thu Aug 19 02:26:49 UTC 2021 said:
> @barray on Thu Aug 19 00:10:40 UTC 2021 said:
> Scratch that, it looks as if people are actually generating #hash #collisions by modifying an existing image to match a #hash https://github.com/AsuharietYgvar/AppleN.. #apple have really fucked up on the response to this one... They should at the very least put #csam on ice until #researchers have been given more time to figure this out. Also, claims that it's "not the final #version " are not good enough, especially when you invited people to #attack it!
And now the #source #code is out there for generating your own #csam #hash #collision that can fool #apple 's #detection #algorithm https://github.com/anishathalye/neural-h.. Apple claim it will be impossible to figure out what #vectors they are using to trigger detection, but it's only a matter of time before somebody figures them out with a #database of #childporn (something the #uk #police , for example, store). So essentially #stateactors have the ability to #reverseengineer their vectors.
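The collision tools linked above follow one basic recipe: treat the hash network as a differentiable function and nudge a source image until its output bits match a chosen target hash. Here is a minimal sketch of that attack shape against a toy linear "hash" - the tiny matrix, sizes, loss and step size are stand-ins for illustration, not Apple's NeuralHash or the linked repo's code.

```python
# Forge a hash collision against a toy sign-of-projection hash by gradient
# descent on a smooth surrogate loss. Same idea as the linked collider, but
# everything here (W, sizes, hinge loss, learning rate) is made up.
import numpy as np

rng = np.random.default_rng(0)
N_PIXELS, N_BITS = 64, 16                    # toy sizes; the real hash is 96 bits
W = rng.normal(size=(N_BITS, N_PIXELS))      # stand-in for the trained network

def hash_bits(img: np.ndarray) -> np.ndarray:
    """Sign-threshold a projection of the image: one bit per row of W."""
    return (W @ img > 0).astype(int)

target = hash_bits(rng.normal(size=N_PIXELS))   # the hash we want to forge
img = rng.normal(size=N_PIXELS)                 # an unrelated starting "image"

signs = 2 * target - 1                          # {0,1} -> {-1,+1}
for step in range(2000):
    margin = signs * (W @ img)                  # > 0 once a bit matches the target
    grad = -(W.T @ (signs * (margin < 1.0)))    # gradient of sum(max(0, 1 - margin))
    img = img - 0.01 * grad                     # small nudge to the image
    if np.array_equal(hash_bits(img), target):
        print(f"forged a matching hash after {step + 1} steps")
        break
```

A real attack also has to keep the perturbation visually small (for example by penalising how far the pixels move); the toy above skips that to stay short.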
@barray on Thu Aug 19 00:10:40 UTC 2021 said:
> @barray on Thu Aug 19 00:00:50 UTC 2021 said:
> #apple are claiming there is no issue with their #csam #detection #algorithm because #researchers are not using the "final version", despite having invited #researchers to #test it: https://www.vice.com/en/article/wx5yzq/a.. The lack of awareness on this is actually shocking... Even if researchers can't produce #hash #collisions on the #private version, #independent #nations can!
Scratch that, it looks as if people are actually generating #hash #collisions by modifying an existing image to match a #hash https://github.com/AsuharietYgvar/AppleN.. #apple have really fucked up on the response to this one... They should at the very least put #csam on ice until #researchers have been given more time to figure this out. Also, claims that it's "not the final #version " are not good enough, especially when you invited people to #attack it!
@barray on Thu Aug 19 00:00:50 UTC 2021 said:
> @barray on Wed Aug 18 10:48:11 UTC 2021 said:
> Just a few days ago somebody replicated the #apple #csam #detection #algorithm https://github.com/AsuharietYgvar/AppleN.. Just now, somebody has successfully produced a #hash #collision - *exactly* what people were concerned about: https://github.com/AsuharietYgvar/AppleN.. Apparently it's even worse than this - different #hardware generates a different hash because of #float #precision ...
#apple are claiming there is no issue with their #csam #detection #algorithm because #researchers are not using the "final version", despite having invited #researchers to #test it: https://www.vice.com/en/article/wx5yzq/a.. The lack of awareness on this is actually shocking... Even if researchers can't produce #hash #collisions on the #private version, #independent #nations can!
@barray on Wed Aug 18 10:48:11 UTC 2021 said:
Just a few days ago somebody replicated the #apple #csam #detection #algorithm https://github.com/AsuharietYgvar/AppleN.. Just now, somebody has successfully produced a #hash #collision - *exactly* what people were concerned about: https://github.com/AsuharietYgvar/AppleN.. Apparently it's even worse than this - different #hardware generates a different hash because of #float #precision ...
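On the float-precision point: the last step of a NeuralHash-style scheme thresholds a real-valued projection at zero, so a projection that lands near zero can come out positive on one arithmetic path and non-positive on another. The three-number example below is contrived to force that cancellation; on real hardware the same edge gets hit through differing CPU/GPU kernels and accumulation order.

```python
# One hash bit is sign(row . embedding). If the dot product nearly cancels,
# the bit depends on rounding. The values below are contrived to show this;
# they are not taken from the real model.
import numpy as np

row = np.array([1e8, 1.0, -1e8])     # one row of the final projection matrix (made up)
emb = np.array([1.0, 1e-3, 1.0])     # one image embedding (made up)

def dot_seq(a, b, dtype):
    """Dot product accumulated left-to-right in the given precision."""
    acc = dtype(0.0)
    for ai, bi in zip(a.astype(dtype), b.astype(dtype)):
        acc = dtype(acc + ai * bi)
    return acc

p64 = dot_seq(row, emb, np.float64)  # ~0.001: the small term survives -> bit 1
p32 = dot_seq(row, emb, np.float32)  # the 1e-3 term is rounded away -> 0.0 -> bit 0

print("float64 projection:", p64, "-> bit", int(p64 > 0))
print("float32 projection:", p32, "-> bit", int(p32 > 0))
# Two devices disagreeing on even one such bit means they disagree on the whole hash.
```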
@barray on Wed Aug 18 04:47:49 UTC 2021 said:
It looks like even the #german #government is urging #apple to reconsider their #csam #scanning of users' #files https://appleinsider.com/articles/21/08/.. Hopefully they get rid of the #backdoor ...
@barray on Sat Aug 14 07:04:48 UTC 2021 said:
#apple completely miss the point by continuing to push #csam #child #safety #measures https://www.reuters.com/technology/after.. The point is that these powers will expand and #freedom will erode. Apple themselves literally say: "This program is ambitious, and protecting children is an important responsibility. These efforts will evolve and expand over time."
@barray on Wed Aug 11 10:53:53 UTC 2021 said:
Looks like the #eu joins #apple in #csam #child #pornography #protections https://www.patrick-breyer.de/en/posts/m.. Now companies have permission to search *all* #private #encrypted #messages for any offensive material. *I'm sure this won't be misused in the near future.* They have brought in #chatcontrol and taken away #secrecyofcorrespondence from #europeans - thank goodness for #uk and #brexit ! I suspect #boris will just roll out something worse though...