Translating Text
With the help of a large language model
Recently, I conducted a two-part experiment in the use of a large language model to translate a chapter from a book written in German more than a hundred years ago. The first half of this trial, which I described in a piece that I posted last month, employed ChatGPT to fix the mistakes that occurred when OCR (optical character recognition) software converted the product of old-time printing into digital text. The second part of the test asked the same program to convert the corrected text into readable English.
On the whole, ChatGPT did a reasonably good job of translating wooden, inelegant, and painfully academic German prose into wooden, inelegant, and painfully academic English prose. In particular, the program made no attempt to break up long sentences or use vibrant verbs to reduce reliance on the passive voice. In this respect, the product of artificial intelligence turned out to be far less accessible than a pen-and-paper translation of the chapter made by an American in the 1970s.
As might be expected, ChatGPT ran afoul of terms of art, whether those that belonged to the first decade of the twentieth century (in which the chapter was written) or those peculiar to the subject of the chapter (the work of a man active in the years between 1585 and 1625). Thus, in this respect, the machine translation proved inferior to the work of a human being who, in addition to knowing a great deal about the subject at hand, had already decoded more than fifteen hundred pages of the author’s work.
On a happier note, the translation contained but one of the capricious substitutions I have come to expect from ChatGPT. Rather than translating the title of a book mentioned in the original text, the large language model translated the name of another work that had been written by the same person.
This seemingly gratuitous replacement may have resulted from an attempt by ChatGPT to double-check its own work. More specifically, I can imagine that, when the program realized that it was dealing with the title of a book, it searched through library catalogs for a mention of the work. However, when it failed to find the volume in question, it concluded that another product of the same pen was, in fact, the very thing that it was looking for.
Of course, it is also possible that I am far too kind to the man-made brain. Thus, I can also conjure a scenario in which a mischievous programmer decided to have some fun by teaching his algorithmic pupil the ancient art of the bait-and-switch.