Description
Three paradigms have dominated machine translation (MT): rule-based machine translation (RBMT), statistical machine translation (SMT), and example-based machine translation (EBMT). These paradigms differ in the way they handle the three fundamental processes in MT: analysis, transfer, and generation (ATG). In its pure form, RBMT uses rules, while SMT uses data. EBMT tries a combination: data supplies translation parts that rules recombine to produce a translation.
Machine Translation compares and contrasts the salient principles and practices of RBMT, SMT, and EBMT. Offering an exposition of language phenomena followed by modeling and experimentation, the text:
- Introduces MT against the backdrop of language divergence and the Vauquois triangle
- Presents expectation maximization (EM)-based word alignment as a turning point in the history of MT
- Discusses the most important element of SMT: bilingual word alignment from pairs of parallel translations
- Explores the IBM models of MT, explaining how to find the best alignment given a translation pair and how to find the best translation given a new input sentence
- Covers the mathematics of phrase-based SMT, phrase-based decoding, and the Moses SMT environment
- Provides complete walk-throughs of the working of interlingua-based and transfer-based RBMT
- Analyzes EBMT, showing how translation parts can be extracted and recombined to translate a new input, all automatically
- Includes numerous examples that illustrate universal translation phenomena through the usage of specific languages

Machine Translation is designed for advanced undergraduate-level and graduate-level courses in machine translation and natural language processing. The book also makes a handy professional reference for computer engineers.
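To give a flavor of the EM-based word alignment highlighted above, here is a minimal sketch of IBM Model 1 training on a toy parallel corpus. The corpus, variable names, and the fixed iteration count are illustrative assumptions, not material from the book; the book develops the full IBM models in detail.

```python
from collections import defaultdict

# Toy parallel corpus of (source, target) sentence pairs (illustrative only).
corpus = [
    (["das", "haus"], ["the", "house"]),
    (["das", "buch"], ["the", "book"]),
    (["ein", "buch"], ["a", "book"]),
]

# Translation probabilities t(e|f), initialized uniformly.
t = defaultdict(lambda: 0.25)

for _ in range(20):  # EM iterations
    count = defaultdict(float)  # expected counts c(e, f)
    total = defaultdict(float)  # expected counts summed over e, per f
    for f_sent, e_sent in corpus:
        for e in e_sent:
            # E-step: distribute a fractional count for e over all
            # source words f, in proportion to the current t(e|f).
            z = sum(t[(e, f)] for f in f_sent)
            for f in f_sent:
                c = t[(e, f)] / z
                count[(e, f)] += c
                total[f] += c
    # M-step: re-estimate t(e|f) from the expected counts.
    for (e, f), c in count.items():
        t[(e, f)] = c / total[f]
```

After a few iterations the probability mass concentrates on the correct pairings (for example, "haus" aligning with "house"), even though no alignments were ever observed directly: this is the insight the book presents as a turning point in the history of MT.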
Print versions of this book also include access to the ebook version.