Twitter was quick to defend itself Monday morning after a pornographic clip briefly appeared in front of all users as an Editor's Pick on its controversial new video-sharing iPhone app, Vine. The company said in a statement that the video had been removed and that the mishap was the result of "a human error," but what does that mean, exactly? And how much of a porn problem does Vine already have?
One possibility is that there are already enough borderline dirty videos on the five-day-old service that a staffer accidentally promoted a popular clip to the best-of list without checking it first. But Twitter's "human error" excuse could also mean that a human at the company designed an algorithm that put the video in front of everybody, which would confirm that the dirty videos really are that popular. Either way, Twitter's early porn problems with Vine (and with the notoriously anti-porn Apple) appear to be real and growing, with NSFW videos prominent enough to cause headaches for the staff, if not for users everywhere. A search for certain inappropriate hashtags (#NSFW, #NSFWVine, and #Dildos, all of which were somehow involved in today's "editor's pick") shows that it's not just a few dirty clips here and there. No, it appears that Twitter has already created a hub for six-second user-generated porn.