@barray on Thu Aug 19 00:10:40 UTC 2021 said:

And now the #source #code is out there for generating your own #csam #hash #collision that can fool #apple 's #detection #algorithm: https://github.com/anishathalye/neural-h..

Apple claim it will be impossible to figure out what #vector they are using to trigger detection, but it's only a matter of time before somebody works it out with a #database of #childporn (something the #uk #police, for example, store). So essentially #stateactors have the ability to #reverseengineer their vectors.

Scratch that, it looks as if people are already generating #hash #collisions by modifying an existing image to match a target #hash: https://github.com/AsuharietYgvar/AppleN..

#apple have really fucked up the response to this one... They should at the very least put #csam detection on ice until #researchers have been given more time to figure this out. And the claim that it's "not the final #version" is not good enough, especially when you invited people to #attack it!
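For anyone wondering what "modifying an existing image to match a hash" looks like in practice, here's a minimal sketch of the idea: run gradient descent on the pixels of a source image until a differentiable perceptual-hash model assigns it the same hash bits as a target image, while keeping the perturbation small. Everything here is a made-up toy (the `ToyPerceptualHash` model, the 96-bit hash length, the loss weights) purely to illustrate the attack, not Apple's NeuralHash or the code in the linked repos.

```python
# Toy collision sketch: nudge `source` until a toy perceptual-hash model
# gives it the same sign-bit hash as `target`. Illustration only.
import torch
import torch.nn.functional as F

torch.manual_seed(0)

HASH_BITS = 96  # assumed hash length for the toy model


class ToyPerceptualHash(torch.nn.Module):
    """Stand-in for a neural perceptual hash: conv features -> linear -> sign bits."""

    def __init__(self):
        super().__init__()
        self.conv = torch.nn.Conv2d(3, 8, kernel_size=5, stride=4)
        self.fc = torch.nn.Linear(8 * 15 * 15, HASH_BITS)

    def forward(self, x):
        h = F.relu(self.conv(x))
        return self.fc(h.flatten(1))  # real-valued logits; sign() gives the hash bits


def hash_bits(model, x):
    return (model(x) > 0).int()


model = ToyPerceptualHash().eval()

target = torch.rand(1, 3, 64, 64)  # image whose hash we want to collide with
source = torch.rand(1, 3, 64, 64)  # innocuous image we perturb
target_sign = torch.sign(model(target).detach())

x = source.clone().requires_grad_(True)
opt = torch.optim.Adam([x], lr=0.01)

for step in range(500):
    opt.zero_grad()
    # Hinge loss pushes each logit of x to the same side as the target's,
    # while the MSE term keeps the perturbed image close to `source`.
    loss = F.relu(1.0 - model(x) * target_sign).sum() \
        + 10.0 * (x - source).pow(2).mean()
    loss.backward()
    opt.step()
    with torch.no_grad():
        x.clamp_(0.0, 1.0)  # keep pixels in a valid range

with torch.no_grad():
    match = (hash_bits(model, x) == hash_bits(model, target)).float().mean().item()
print(f"fraction of matching hash bits: {match:.2f}")
```

Against the real NeuralHash the published attacks work the same way in spirit: extract the model, then optimise the image until the hashes collide, which is exactly why "not the final version" is such a weak answer.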