
Starting later this year, Apple will be able to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, enabling Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies.

Apple says the method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching against a database of known CSAM image hashes provided by the NCMEC and other child safety organizations. Apple transforms this database into an unreadable set of hashes that is stored securely on users' devices.
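Conceptually, each photo is hashed on the device and that hash is checked against the stored set before upload. The Swift sketch below illustrates only this matching step, under loud assumptions: Apple's actual system uses a perceptual hash (NeuralHash) and cryptographic private set intersection, not the plain SHA-256 digest and Set membership used here, and every identifier in the sketch (loadKnownHashDatabase, imageHash, matchesKnownCSAM) is hypothetical.

```swift
import Foundation
import CryptoKit

// Minimal sketch of on-device hash matching, not Apple's implementation.
// The real design pairs a perceptual hash (NeuralHash) with a private set
// intersection protocol; SHA-256 and a plain Set are stand-ins here, and
// all identifiers below are hypothetical.

/// Hypothetical loader for the on-device hash database. In the real design
/// the device stores an unreadable (blinded) set it cannot enumerate.
func loadKnownHashDatabase() -> Set<String> {
    []  // placeholder: would be populated from the OS-shipped database
}

/// Stand-in for the per-image perceptual hash computed on the device.
func imageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

let knownHashes = loadKnownHashDatabase()

/// Check a photo against the known-hash set before it is uploaded.
func matchesKnownCSAM(_ imageData: Data) -> Bool {
    knownHashes.contains(imageHash(imageData))
}

// Example: a placeholder image never matches the empty placeholder database.
print(matchesKnownCSAM(Data()))  // false
```

The design choice worth noting is where the comparison happens: because matching runs on the device against a hash set the device itself cannot read, photos are never scanned server-side, which is the privacy property Apple emphasizes.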
