Apple has postponed plans to roll out a system that would scan iPhones for known child sexual abuse material (CSAM). Following a massive public outcry, the iPhone maker said it is suspending the feature, which had been slated to launch before the end of the year. Apple gave no new timeline, saying only that it will take more time before release.

The NeuralHash technology is designed to compare hashes of images on an iPhone against a database of known child sexual abuse images maintained by the National Center for Missing and Exploited Children (NCMEC). When an image on a user's device matches an entry in the NCMEC database, it is flagged for review by a human reviewer.

If the reviewer confirms that the image does depict child sexual abuse, the image and the identified user are reported to the authorities for further action. The scanning applies to images as they are uploaded to iCloud Photos, flagging matches that could indicate child sexual abuse material.
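NeuralHash itself is a proprietary neural perceptual hash, so its details are not public. As a rough illustration of the general idea of hash-based matching, the toy sketch below uses a simple "average hash" (a well-known perceptual-hash stand-in, not Apple's algorithm): an image's hash is compared against a database of known hashes, and a near-match within a few bits still triggers a flag even if the image has been slightly altered.

```python
# Illustrative sketch of perceptual-hash matching (NOT Apple's NeuralHash,
# which is a proprietary neural hash). Uses a toy "average hash":
# pixels brighter than the image's mean brightness become 1 bits.

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255). Returns an int bitmask."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a, b):
    """Number of bit positions where the two hashes differ."""
    return bin(a ^ b).count("1")

def matches_database(pixels, known_hashes, max_distance=2):
    """Flag an image whose hash is within max_distance bits of a known hash."""
    h = average_hash(pixels)
    return any(hamming_distance(h, k) <= max_distance for k in known_hashes)

# Hypothetical 4x4 grayscale images for demonstration only.
known = [average_hash([[0, 0, 255, 255]] * 4)]
near_copy = [[5, 3, 250, 252]] * 4   # slightly altered version of the original
print(matches_database(near_copy, known))  # True: hashes match despite edits
```

The point of a perceptual hash, as opposed to a cryptographic one, is exactly this tolerance: small edits such as re-compression or brightness changes should still map to a nearby hash, which is why matches are then escalated to a human reviewer rather than acted on automatically.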

The plan for the image scan was announced in August, and the backlash came from many quarters. One of the most vocal groups, the Electronic Frontier Foundation (EFF), gathered more than 25,000 signatures on a petition opposing the planned image scan.

“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple stated.

The company noted that “Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and limit the spread of child sexual abuse material,” but the public outcry has prompted it to reconsider.

Critics of the plan said that scanning iPhone users’ devices violates their privacy and could empower authoritarian governments to spy on people. Because Apple has long promised strong protections for its customers, users argued that implementing the iPhone scan would break the company’s own security pledges.

“The company must go further than just listening and drop its plans to put a backdoor into its encryption entirely,” EFF’s executive director Cindy Cohn said. “The enormous coalition that has spoken out will continue to demand that user phones – both their messages and their photos – be protected and that the company maintains its promise to provide real privacy to its users.”