A simpler way to remove explicit images from Search

41 points by gnabgib 18 hours ago on Hacker News | 29 comments

guessmyname | 17 hours ago

Why is Google indexing these harmful images in the first place?

Microsoft, Google, Facebook, and other large tech companies have had image recognition models capable of detecting this kind of content at scale for years, long before large language models became popular. There’s really no excuse for hosting or indexing these images as publicly accessible assets when they clearly have the technical ability to identify and exclude explicit content automatically.

Instead of putting the burden on victims to report these images one by one, companies should be proactively preventing this material from appearing in search results at all. If the technology exists, and it clearly does, then the default approach should be prevention, not reactive cleanup.

whatevermom5 | 17 hours ago

They probably make money showing pork search results

FrankBooth | 17 hours ago

That sounds haram.

ahofmann | 17 hours ago

Filthy pork addicts...

whatevermom5 | 16 hours ago

Oink oink

Dylan16807 | 16 hours ago

How is an image model supposed to detect if there was consent to share the picture?

If you're saying they shouldn't index any explicit images, you're talking about something very different from the article.

drdaeman | 16 hours ago

I think the “one by one” part allows different interpretations of what guessmyname might have meant.

But I fail to make sense of it either way. Either the nuance of lacking consent is missing, or Google is being blamed for not doing from the very first version what they just did.

vasco | 16 hours ago

I don't see how the religious groups that forced card payment processors to ban Pornhub et al. aren't going to abuse this by mass-reporting every nude picture they find as their own.

BuyMyBitcoins | 16 hours ago

>”those religious groups that forced card payment processors to ban pornhub et al”

I question how much influence such groups actually have, given that payment processors already dislike dealing with adult oriented businesses.

The percentage of chargebacks and disputes for those transactions is significantly higher than for any other category. Companies hate having to investigate such claims and issue new cards, even when it's fairly obvious the purchase was made by the cardholder. It's also tricky from a customer service standpoint, because the cardholder may well be lying in order to hide an embarrassing purchase from a spouse or other family member.

It seems like payment processors just want to get rid of a hassle for themselves.

vasco | 15 hours ago

Well, the religious groups certainly take the credit for it themselves and continue their quest; the latest target was Steam.

https://www.collectiveshout.org/progress_in_global_campaign_...

mlindner | 16 hours ago

Google practically never shows explicit images to anyone anymore anyway. Even Bing doesn't. I feel like we've returned to a more prudish society, at least on the mainstream internet.

advisedwang | 16 hours ago

I don't think it's prudish to want the ability to take down deepfakes of you naked or leaked images of you.

Dylan16807 | 16 hours ago

Your comment has basically no connection to the comment you replied to. (Which itself had a weak connection to the article, but that's a separate issue.)

The article is about removing non-consensual sexually explicit images and deepfakes.

ReptileMan | 14 hours ago

And on whom does the burden of proof fall that an image is non-consensual or a deepfake? We danced a similar dance with DMCA takedowns.

It's a small change to an existing suite of features Google provides around preventing personal information from being exposed in search results.

https://myactivity.google.com/results-about-you/

The burden was always on the victims.

Dylan16807 | 3 hours ago

The article is. mlindner's comment isn't.

The article mentions they're introducing a new way to request the removal of non-consensual explicit images on Search.

The key bit is non-consensual, so it's unrelated to individual morality, and they're providing a way to report a real crime.

robocat | 16 hours ago

> Please don't comment on whether someone read an article.

https://news.ycombinator.com/newsguidelines.html

selcuka | 16 hours ago

The GP comment is in compliance with the guideline:

> Please don't comment on whether someone read an article. "Did you even read the article? It mentions that" can be shortened to "The article mentions that".

"You should really read the article" is semantically the same as "The article mentions that". It's not a question.

efilife | 16 hours ago

Even DuckDuckGo has started censoring their results, although very subtly. My friend (really) showed me an explicit search query with safe search turned off, which I then compared with Yandex, and I was really surprised how different the results were. Nothing explicit on DDG, even though the query included the word "hentai".

(I am aware this is not really related to the article. I think this is a cool discussion to be had)

etchalon | 16 hours ago

"Please don't regulate us" step 6,438.

dannyw | 16 hours ago

Looks like a nice and well designed improvement that will help people.

I can see this is related to the sad and ongoing ‘purification’ of the internet, but still, I'm not going to get upset over better UX for taking down deepfakes or non-consensual explicit images, which do hurt people.

xeyownt | 16 hours ago

So you can pinpoint for Google exactly which images are of high (damaging) value, and Google will show you more of them.

What could go wrong?

dannyw | 16 hours ago

Oh wow. Actually, that's a really good point; I'm not sure how you could counter that (lots of regulations more or less forbid Google from hiding reporting/takedown flows etc. behind an account).

Hopefully Google didn’t just build the world’s best deepfake search…

Why is this limited to sexual images?

Why can I not control whether any public image of me appears at all?

You can - it's all part of their Results About You tool:

https://myactivity.google.com/results-about-you/