3 Comments
Feb 6

"Save Taylor Swift?" For heaven's sake, get a grip. The medium is the message here. Whether you're watching the Olympics, a moon landing, a building collapsing, a fistfight, a murder, or deepfake porn, in the end you're just watching TV--an arrangement of coloured dots on a screen. None of it is real, and everybody knows it. Any number of actresses who would never in a million years appear nude in public have willingly allowed simulations of themselves to appear nude on screens, because they understand this difference. They aren't being "abused," no matter how many viewers regard their screen simulations; and neither is Taylor Swift in this case.

If we're going to indulge in moral panics, let's at least confine them to issues involving actual harm to people. Prohibitions against harmful behaviour one can at least understand; the impulse to censor screen images has never made any sense. All manner of alarming situations and despicable behaviours get depicted on screens that we wouldn't hope to see in real life, and that's fine: what would cinema be without this freedom? Whether or not Taylor Swift posed for the pictures in question really is irrelevant here, as is the extent to which the images correspond to anything in the real world. Either way, it isn't her on the screen.

Ms. Swift may have a legitimate grievance if someone is profiting commercially from her image in some illegitimate way, but that is a different matter. There's nothing in this incident that warrants the scandalized tone of the article, or that could be said to legitimize the hysterical overkill of blocking searches on Taylor Swift altogether, even "temporarily." Such overreaction sets an unfortunate precedent that would-be censors will almost certainly try to exploit.

P.S. Pretty much one hundred percent of this bizarre uproar is attributable to our hypocritical, irreformably puritanical approach to sexual matters. Photoshopping heads onto bodies has been within our capacity for years, and of course we've gotten better at it and at other kinds of image-tampering, a trend that growing competence with A.I. tools will doubtless accelerate. But nobody would right now be blocking internet searches, or calling for more internet controls, simply because some Facebook user had doctored a photo to make it appear that he/she and Taylor Swift were baking cookies together. A picture like that might generate some debate, but it wouldn't precipitate a moral panic.


As always, the porn industry seems to be leading the adoption of new technology. A broader concern is where else will AI-enabled deepfake technology make an impact on our lives? I'm concerned about the ability of political partisans or foreign actors to introduce fake and fraudulent videos into public discourse.

Mere photoshopped pictures attached to satirical articles from "The Onion" and "The Babylon Bee" are already interpreted as real news all too often and require debunking for years afterwards. What happens when somebody deepfakes video of a politician giving a speech as if they're saying something truly scandalous?

Changing the law to make AI-enabled fakes illegal is something, but I'm not sure how enforceable it is. I think the answer here may be more along the lines of setting up a verification scheme that would voluntarily be used to authenticate *real* content, a bit like Verisign authentication for websites. It'd be far from foolproof, but it would be a quick test to sort out the good actors from the rest.
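To make the idea concrete, here is a minimal sketch of content authentication using Python's standard-library HMAC. This uses a shared secret key for simplicity; a real scheme of the kind described (like C2PA or public-key certificates) would use public-key signatures so that anyone can verify without holding a secret. `PUBLISHER_KEY`, `sign_content`, and `verify_content` are all hypothetical names for illustration:

```python
import hmac
import hashlib

# Hypothetical secret held by the publisher. A real verification scheme
# would use a public/private key pair instead of a shared secret.
PUBLISHER_KEY = b"example-publisher-secret"

def sign_content(content: bytes, key: bytes = PUBLISHER_KEY) -> str:
    """Produce a hex tag attesting that the content came from the key holder."""
    return hmac.new(key, content, hashlib.sha256).hexdigest()

def verify_content(content: bytes, tag: str, key: bytes = PUBLISHER_KEY) -> bool:
    """Recompute the tag and compare in constant time; any byte-level
    tampering with the content invalidates the tag."""
    return hmac.compare_digest(sign_content(content, key), tag)

# A publisher signs genuine footage at release time...
video = b"original press-conference footage bytes"
tag = sign_content(video)

# ...and viewers (or platforms) can check it later.
print(verify_content(video, tag))                  # authentic content
print(verify_content(b"altered " + video, tag))    # tampered content
```

Content that fails verification isn't proven fake, only unvouched-for, which matches the "sort out the good actors" framing: the scheme attests authenticity rather than detecting forgery.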


As a bumbling hillbilly I'm deeply offended by your comment: "the Justin Trudeau government which, when it comes to issues involving the internet, has so far behaved like a band of bumbling hillbillies."
