Big data comes to the translation sector

Source: Common Sense Advisory
Story flagged by: Lucia Leszinsky

In our 2011 report on trends in machine translation (MT), we found that “the statistics-based approach to MT is basically a big-data application” (see “Trends in Machine Translation,” Oct11). We predicted that experts would apply these algorithms to crack inter-language communication and marketing issues as they processed more languages and huge volumes of multilingual content. We wrote that the use of such techniques would increase “both quality and understanding of how different languages affect perception and behavior.”

This week SDL coined the term “big language” to describe the intersection of many languages and mushrooming content volumes. The company stated that these two forces have combined to transform “Big Data challenges into Big Language challenges.” Besides introducing the term, the company announced that its Language Platform, which includes the workflow and translation productivity solutions SDL WorldServer, TMS, and Studio, has been enhanced to support its BeGlobal machine translation software. It also added a self-training tool that will let in-house corporate and language service provider (LSP) teams improve MT quality and increase translator productivity.

