Linked by Thom Holwerda on Wed 12th Dec 2012 22:03 UTC
A change to anything related to Google Search - the product so many of us rely on - is never going to go by unnoticed. This time around, Google has altered Image Search for US users to change the way it handles that ever so important aspect of the web - adult content.
Thread beginning with comment 545263
RE[2]: Totally Predictable...
by galvanash on Sat 15th Dec 2012 08:29 UTC in reply to "RE: Totally Predictable..."
Member since:
2006-01-25

Did you even try? If you are explicit enough - and "tittyfucking" is pretty explicit - you don't need to add a qualifier at all; it works as is (assuming SafeSearch is turned off).

The algorithm may not be perfect, but it is pretty good. Words that have common non-pornographic meanings (for example, "fuck" is often used as an expletive) are handled by default in a way that filters out pornographic content; you have to be more specific to get them to return pornography.

For example, "stupid fuck" filters out almost all porn images (although a few get through), while "redhead fuck" doesn't - lots of porn!

Is that really so bothersome? I think everyone has just had a knee-jerk reaction to this feature - it is actually quite logical to me, no politics involved...

Reply Parent Score: 2

UltraZelda64 Member since:
2006-12-05

Yes, I did briefly test both with and without "porn" as a modifier, and omitting it led to very sub-par results. I had already verified this a couple of days ago with other similarly obvious terms, including "blowjob". In my testing, the words "naked" and "nude" acted as very strong modifiers for bringing up the expected results, but each led to a completely different set of images.

Reply Parent Score: 2