Linked by Thom Holwerda on Wed 12th Dec 2012 22:03 UTC
Google A change to anything related to Google Search - the product so many of us rely on - is never going to go by unnoticed. This time around, Google has changed Image Search for US users, altering the way it handles that ever so important aspect of the web - adult content.
RE[2]: Totally Predictable...
by galvanash on Sat 15th Dec 2012 08:29 UTC in reply to "RE: Totally Predictable..."

Did you even try? If you are explicit enough (and "tittyfucking" is pretty explicit), you don't need to add a qualifier at all - it works as is (assuming SafeSearch is turned off).

The algorithm may not be perfect, but it is pretty good. Words that also have non-pornographic meanings (for example, "fuck" is commonly used as an expletive) are handled by default in a manner that filters out pornographic content; you have to be more specific in order to get them to return pornography.

For example, "stupid fuck" filters out almost all porn images (although a few get through), "redhead fuck" doesn't - lots of porn!

Is that really so bothersome? I really think everyone has just had a knee-jerk reaction to this feature - it is actually quite logical to me, no politics involved...
