Linked by Thom Holwerda on Mon 5th Feb 2018 23:04 UTC, submitted by Morgan
Google

Such a development would cause a soul-shattering upheaval in my mental life. Although I fully understand the fascination of trying to get machines to translate well, I am not in the least eager to see human translators replaced by inanimate machines. Indeed, the idea frightens and revolts me. To my mind, translation is an incredibly subtle art that draws constantly on one's many years of experience in life, and on one's creative imagination. If, some "fine" day, human translators were to become relics of the past, my respect for the human mind would be profoundly shaken, and the shock would leave me reeling with terrible confusion and immense, permanent sadness.

As a translator myself, I can indeed confirm Google Translate is complete and utter garbage, but the idea that I would "mourn" the end of translators seems outlandish to me. The unstoppable march of technology has eliminated countless jobs over the course of human existence, and if translators are next, I don't see any reason to mourn the end of my occupation. Of course, it'd suck for me personally, but that's about it.

That being said, I'm not afraid of running out of work any time soon. Google Translate's results are pretty terrible, and they only seem to be getting worse for me, instead of getting better. There's no doubt in my mind that machine translation will eventually get good enough, but I think it'll take at least another 20 years, if not more, to get there.

fabrica64 Member since:
2013-09-19

ML, AI and "neural networks" just try to learn and decide from experience; in other words, they learn from statistical analysis.

Does anyone really think that statistical analysis can translate emotions? That it can decide whether something is beautiful or ugly? Computers are deterministic machines; they have no notion of emotion and never will.
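
To make that concrete, here is a deliberately crude sketch of what deciding by "statistical analysis" looks like for translation. Everything below (the corpus, the phrases, the counting rule) is invented for illustration and is nothing like how any real system is built; the point is only that the output is a frequency vote, not understanding.

    # Toy sketch: "translation" as pure statistics over a made-up
    # parallel corpus. Real systems are vastly more sophisticated,
    # but the decision is still driven by observed frequencies.
    from collections import Counter

    parallel_corpus = [                 # hypothetical aligned phrase pairs
        ("배불러요", "I'm full"),
        ("배불러요", "I'm full"),
        ("배불러요", "I'm hungry"),      # one bad pair skews the statistics
    ]

    def translate(source: str) -> str:
        counts = Counter(target for src, target in parallel_corpus if src == source)
        return counts.most_common(1)[0][0]   # the most frequent target wins

    print(translate("배불러요"))             # -> "I'm full", by majority vote only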

Edited 2018-02-06 00:45 UTC

Reply Score: -1

agentj Member since:
2005-08-19

These automated translators are a complete failure; they produce so many poor-quality translations, or even the complete opposite meaning. Until recently GT translated the Korean "배불러요" (I'm full) as "I'm hungry", and "내꺼야" (mine, as in belonging to me) into Polish as "kopalnia", which means an underground mine. Until we develop fully sentient AI that can perform at least at a human level, I think human translators are pretty safe.

Edited 2018-02-06 03:29 UTC

Reply Parent Score: 1

Morgan Member since:
2005-06-29

"내꺼야" (mine, as in belonging to me) into Polish as "kopalnia", which means an underground mine.


That sounds suspiciously like it translated the Korean into English and then into Polish, and since "mine" is a homonym in English for both possession and underground tunnels, it perhaps randomly chose the latter meaning for the final translation.
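
A toy sketch of that pivot hypothesis (the dictionaries and the sense-picking rule below are invented for illustration; Google's actual pipeline is not public): once the Korean is collapsed to the bare English word "mine", the possessive sense is gone and nothing stops the excavation sense from winning.

    # Hypothetical pivot translation through English, losing the word sense.
    ko_to_en = {"내꺼야": "mine"}                  # possessive sense intended
    en_to_pl = {"mine": ["kopalnia", "mój"]}       # homonym: excavation vs. possession

    def pivot_translate(korean: str) -> str:
        english = ko_to_en[korean]                 # Korean -> English drops the context
        candidates = en_to_pl[english]
        return candidates[0]                       # with no context, the wrong sense can win

    print(pivot_translate("내꺼야"))                # -> "kopalnia" (an underground mine)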

I have noticed that both Google and Bing translators work better the more context/sentence structure you provide for them. Perhaps we need a mode where it asks you for context instead of just randomly offering a word-for-word translation when all you provide is one word.

And speaking of Bing, it still does what you indicated Google used to: "내꺼야" becomes "Kopalni".

Reply Parent Score: 5

Gargyle Member since:
2015-03-27

... implying that human minds are NOT deterministic machines?

How can you be so sure?

A bold claim.

Reply Parent Score: 5

fabrica64 Member since:
2013-09-19

I'm not sure, but I am also not sure the human mind IS deterministic and can be emulated by computers. This is something we humans love to debate (as a computer never would :-)), and I don't see the debate stopping any time soon.

But I would bet (and feel) that it's not deterministic.

Reply Parent Score: -1

Carewolf Member since:
2005-09-08

... implying that human minds are NOT deterministic machines?

How can you be so sure?

A bold claim.

Humans are Turing complete, and thus at least our termination is undecidable.
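
For what it's worth, the standard argument behind that claim is the halting problem: for any Turing-complete system, no general procedure can decide termination. A minimal sketch of the classic diagonal argument, with a hypothetical halts oracle that the argument shows cannot exist:

    # Sketch of the diagonal argument. `halts` is a hypothetical oracle;
    # the contradiction below is exactly why it cannot be implemented.

    def halts(program_source: str, input_data: str) -> bool:
        """Pretend oracle: True iff the program halts on the given input."""
        raise NotImplementedError("no total, correct implementation can exist")

    def diagonal(program_source: str) -> None:
        # Do the opposite of whatever `halts` predicts about a program
        # run on its own source code.
        if halts(program_source, program_source):
            while True:        # predicted to halt -> loop forever
                pass
        # predicted to loop forever -> halt immediately

    # Feeding `diagonal` its own source contradicts the oracle either way,
    # so termination is undecidable for any Turing-complete system.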

Reply Parent Score: 2