Apple’s role as moralistic gatekeeper of its App Store is well known: co-founder Steve Jobs famously called it Apple’s “moral responsibility to keep porn off the iPhone.” And when it comes to third-party apps that violate App Store rules of this nature, like making it easy to search for any pornographic content, Apple is pretty quick to take those apps down. We saw this just last week with the high-profile example of 500px. But it’s not clear how consistently Apple is willing to enforce those rules when one of those apps in violation is from a trusted partner company.
Twitter’s recent relaunch of Vine — an app that lets users upload short, looping videos and share them in tweets — has attracted a bunch of attention for the platform’s ability to use the short videos for pornographic content. And there are plenty of examples. See these recent headlines:
So far, Apple has not only let it slide, it featured the new Vine app in the App Store as an Editor’s Choice last Friday. There are plenty of appropriate uses of Vine, and it seems most are using the service without violating Apple’s rules. But the situation Apple is facing with Vine shows the perils of trying to enforce a set of rules that are basically impossible to apply consistently across an App Store of nearly 800,000 apps.
The situation raises two questions: How differently does Apple treat its partners versus regular developers? And shouldn’t Apple care more about Vine displaying porn, since Twitter is integrated into both iOS and OS X?
I’ve reached out to Apple for comment about Vine and will update this story if I hear back.