Apple pulled 500px photography app over child porn complaints

After a bit of confusion over why a popular photography app was pulled from the iOS App Store early Tuesday morning, Apple offered up an official explanation: it had no choice but to pull the app because of complaints of child pornography viewable within the app.

“The app was removed from the App Store for featuring pornographic images and material, a clear violation of our guidelines. We also received customer complaints about possible child pornography. We’ve asked the developer to put safeguards in place to prevent pornographic images and material in their app,” an Apple spokesman said in a statement Tuesday evening.

From the account of the app's removal that 500px COO Evgeny Tchebotarev gave TechCrunch, it's not clear he was aware of the child porn claims, only of a general violation of the pornography rule.

The Apple reviewer told the company that the update couldn't be approved because it allowed users to search for nude photos in the app. This is correct to some extent, but 500px had actually made it tough to do so, explains Tchebotarev. New users couldn't just launch the app and locate the nude images, he says, the way you can today on other social photo-sharing services like Instagram or Tumblr. Instead, the app defaulted to a "safe search" mode in which these types of photos were hidden. To turn off safe search, 500px actually required users to visit its desktop website and make an explicit change.

Apple has been (mostly) clear from the beginning that it doesn't admit to its App Store apps that promote excessive violence, pornography or attacks on religion. Steve Jobs himself called it Apple's "moral responsibility to keep porn off the iPhone."

While there can be different opinions of what constitutes artistic nude photography versus pornography, when children are involved, there’s no wiggle room. If Apple received such a complaint, it had to take the app down.

GigaOM