When morality loses to search; Google sued over celebrity nudes

Early last week we learned that Google is being sued by a group of celebrities whose nude photos were leaked to the Internet through some form of online attack that gained access to their iCloud photos.

The technology community’s reaction was swift and predictable. Its general tone was best captured at Ars Technica, which saw comments such as this:

When will these dinosaurs learn? You can’t scrub content, especially images, off the internet entirely.

And this:

People will say that I’m “blaming the victim” but really the only way to ensure naked photos from appearing on the internet is to never upload them to the internet in the first place.

And this:

Lawsuits to shut up people on the internet just don’t work.

If you hid the name of the company being sued and tried to guess it from the reactions to the article, you'd think it was some helpless small business being victimized by a scummy lawyer and greedy celebrities.

But that, of course, is not what we’re talking about. We’re talking about Google, the world’s largest search company and one of the world’s largest advertisers. Surely they have the resources necessary to respond to takedown requests in a timely manner? Surely they have some responsibility to handle the content they display?

That is, in fact, the point being made by the celebrities' lawyer, who notes (in a quote not often cited by the technology websites covering this story) that:

Ever since the hacked images first began to be posted on websites and blogs [...] we have been sending notices to various website operators and host providers [...]. The vast majority of those sites and ISPs/hosts, all of which are much smaller than Google, with fewer staff and resources, complied with their obligations under the DMCA and removed the Images within an hour or two of receiving our DMCA notice.

Google did not remove the images quickly. It still has not, as a Google Image Search for most of the celebrities victimized by the leak will turn up the leaked photos (along with a long list of photoshops). Some of the photos are hosted on domains that are literally the celebrity's name followed by the word "nude", and all of them are obviously indexed well enough to make them visible in Google's search engine. The company knows how to find them. Yet they're still visible, still obtainable from the top of Google's search results eight days after this lawsuit was filed.

It’s incredible to me that so many people are happy to give Google a pass on this with the usual hand waving about openness and freedom. They are, in effect, elevating the “rights” of a set of search results above the rights of real human beings. This is the sci-fi dystopia where the value of data trumps even the most basic morality, and it’s happening right now.

At times like these I'm reminded of Jaron Lanier's You Are Not a Gadget, a book that sharply points out that technology is not an unquestionable instrument delivered from the heavens but simply a tool built by humans with their own motivations. To quote him:

I fear that we are beginning to design ourselves to suit digital models of us, and I worry about a leaching of empathy and humanity in that process.

I worry about that, too. Not because technology is inherently dehumanizing, but because there's always a wizard behind the curtain designing the models we conform to. And we're becoming very, very good at refusing to acknowledge that he exists.