
Apple scraps plans to scan iPhones for child abuse images

Apple has officially scrapped its controversial plan to scan iCloud images for Child Sexual Abuse Material (CSAM).

On Wednesday, the company announced that it would not be moving forward with its plans for on-device scanning.

‘We have further decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos,’ said Apple in a statement to Wired.

‘Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.’

In August 2021, Apple announced that it was in the process of developing a system which would automatically recognise illegal images when they were uploaded to iCloud and alert the authorities.

The initial plan was that iPhone users’ entire photo libraries would be checked for known child abuse images if they were stored on Apple’s iCloud service.

Later, the technology giant announced the launch had been pushed back to ‘make improvements’ after campaigners said the programme breached privacy standards.

Some suggested the tool could be hijacked by authoritarian governments to look for other types of images.

Now, it seems the plan has been scrapped entirely and the company will focus on deepening its investment in the ‘Communication Safety’ feature.

In April, Apple released the child safety feature, which uses artificial intelligence (AI) to detect nudity in messages.

The feature launched in the US last year before expanding to the Messages app on iOS, iPadOS, and macOS in the UK, Canada, New Zealand, and Australia.

Now, parents can turn on warnings on their children’s iPhones so that all photos sent or received by the child in Apple’s Messages app are scanned for nudity.

Once the feature is enabled, if nudity is detected in a photo received by a child, the photo will be blurred, and the child will be warned that it may contain sensitive content and pointed towards resources from child safety groups.

