This article is from the archive of our partner The Wire.

Twitter was quick to defend itself Monday morning after a pornographic clip briefly appeared in front of all users as an Editor's Pick on its controversial new video-sharing iPhone app, Vine. The company said in a statement that the video had been removed and that the mishap was the result of "a human error," but what does that mean, exactly? And how much of a porn problem does Vine already have?

This could have happened because there are already enough borderline dirty videos on the five-day-old service for a human curator to have placed a popular one on the best-of list without checking it first. But Twitter's "human error" excuse could also mean that a human at the company designed an algorithm that put the video in front of everybody, which would confirm that the dirty videos really are that popular. Either way, Twitter's early porn problems with Vine (and with the notoriously anti-porn Apple) appear to be real and growing, with NSFW videos prevalent enough to cause headaches for the staff, if not for users everywhere. A search for certain inappropriate hashtags (#NSFW, #NSFWVine, and #Dildos, all of which were somehow involved in today's "editor's pick") shows that it's not just a few dirty clips here and there. No, it appears that Twitter has already created a hub for six-second-long user-generated porn.

Twitter has apologized to its users for the error, but it's unclear how, or whether, the company will adjust Vine to address the proliferation of porn. Perhaps the humans who pushed the video to the top will act more carefully. Or maybe the app will adjust its algorithm. It already looks like the app is cracking down on porn-related hashtags.

Twitter could, of course, purge all the NSFW content from Vine, but the company's longstanding anti-censorship policy suggests the X-rated content may be on its new app to stay. More damaging would be a move by Apple to pull Vine from iTunes. Apple has cracked down on apps that break its strict anti-nudity policy, including just last week with the photo-sharing app 500px, and a year ago with the "Instagram for Video" app Viddy.

Since initial complaints over the weekend, Vine has at least started marking many videos as "potentially offensive" with a blacked-out warning message, as you can see in the screengrab at left of the "editor's pick" video that stirred up the controversy. That alone generally wouldn't be enough to placate Apple. The official iTunes policy states: "Apps containing pornographic material, defined by Webster's Dictionary as 'explicit descriptions or displays of sexual organs or activities intended to stimulate erotic rather than aesthetic or emotional feelings,' will be rejected." Apple already accepted the Twitter app, but 500px got pulled because it featured pornographic imagery. So far, Apple hasn't commented on the situation.

