Apple, one of the world’s top tech companies, said that its latest software updates, due at the end of 2021, will include features to identify and combat child sexual abuse images and videos. On the Child Safety section of its website, the company said it is focused on stopping the spread of Child Sexual Abuse Material (CSAM). Apple plans to scan and monitor data across its devices.
The main features and areas of focus for Apple are:
Apple’s Messages app will immediately warn children and their parents when sexually explicit images are received. This provides the first line of defence for CSAM detection and for protecting children.
If a user receives a sexually explicit image, the image will be blurred, similar to a feature already in place on the Facebook feed. The user must then tap the image, or give permission, to view it. This protection applies not only to received images: if a child tries to send an image that is sexual in nature, the parents will be notified.
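The decision flow described above can be sketched roughly as follows. This is an illustrative model only, not Apple’s actual implementation: the names (`IncomingImage`, `handle_incoming`) are hypothetical, and the `is_explicit` flag stands in for the on-device machine-learning classifier Apple described.

```python
# Hypothetical sketch of the Messages communication-safety flow.
# Names are illustrative, not Apple's API; `is_explicit` stands in
# for an on-device ML classifier's verdict.
from dataclasses import dataclass


@dataclass
class IncomingImage:
    data: bytes
    is_explicit: bool  # stand-in for the classifier's decision


def handle_incoming(image: IncomingImage, user_is_child: bool) -> dict:
    """Decide how to present a received image, per the flow above."""
    if image.is_explicit:
        # Explicit images are blurred; the user must tap through to view,
        # and a child user is shown a warning.
        return {
            "display": "blurred",
            "requires_tap_through": True,
            "warn_user": user_is_child,
        }
    return {
        "display": "normal",
        "requires_tap_through": False,
        "warn_user": False,
    }
```

The key design point the sketch captures is that nothing is blocked outright: the image is obscured and viewing it becomes an explicit choice.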
Apple will scan and review data in iCloud and on its devices for CSAM. Apple has assisted authorities before, in missing-person cases and other investigations, by scanning and monitoring user data. Privacy advocates are deeply concerned that this breaches the privacy agreements users have accepted from Apple and see it as an infringement of users’ civil rights. The news has drawn its share of criticism, as the majority of users are concerned about their privacy and the potential for their data to be misused.
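At its core, this kind of detection works by comparing a fingerprint (hash) of each image against a database of hashes of known abuse imagery, rather than by "looking at" the photos themselves. A minimal sketch of that idea, under simplifying assumptions: Apple’s announced system uses a perceptual hash it calls NeuralHash plus cryptographic safety vouchers, whereas this example uses a plain SHA-256 and placeholder entries purely for illustration.

```python
# Minimal sketch of hash-based matching against a set of known-image
# hashes. Real systems use perceptual hashes so that resized or slightly
# edited copies still match; SHA-256 here is only for illustration and
# matches exact byte-for-byte copies.
import hashlib

# Illustrative placeholder entries, not real hashes of any real database.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-a").hexdigest(),
    hashlib.sha256(b"known-image-b").hexdigest(),
}


def matches_known_database(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the known-hash set."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES
```

The privacy debate centres precisely on where this comparison runs: on-device matching means the checking code ships with everyone’s phone, which critics argue creates infrastructure that could later be pointed at other content.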
People who need help regarding CSAM will be directed to the nearest relevant authority or specialist. Anyone who searches for CSAM-related material will be shown a warning prompt. All these updates will arrive at the end of 2021 in iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.