The Evolution of Machine Translation: A 90-Year Journey

It may come as a surprise that machine translation (MT) has been around for more than 90 years. During this time, machine translation technology has gone through multiple models, changes, and developments to become what it is today. And with emerging technologies like ChatGPT by OpenAI, the future of machine translation has never been more exciting. In this blog, we’ll look at some of the highlights in the history of machine translation and share some information about our MT services.

A Brief History of Machine Translation

1930s

The First Machine Translation Patents

The first concrete machine translation patents were issued in 1933: in France by George Artsrouni, a French-Armenian, and in Russia by Petr Smirnov-Troyanskii. Artsrouni designed a paper-tape storage device that could look up the equivalent of a stored word in another language, and he demonstrated a prototype in 1937.
Troyanskii believed that “the process of logical analysis could itself be mechanized.” He pictured a three-stage mechanical translation process and designed a machine to handle the second stage: converting sequences of base forms and grammatical functions into their equivalents in another language.

1949

Machine Translation with Computers

The next big development in machine translation came in 1949, when Warren Weaver proposed using computers and statistical techniques to translate text from one language to another, laying the groundwork for what would eventually become statistical machine translation (SMT).

Rule-Based Machine Translation

The next few decades were dominated by rule-based machine translation (RBMT), which relies on defined linguistic rules for the source and target languages. These rules include predefined grammar, syntax, dictionaries, and vocabulary lists. RBMT is considered the “classical” approach and remained in use until the next developments in MT.
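To make the idea concrete, here is a deliberately tiny, hypothetical sketch of how a rule-based system works: a bilingual dictionary plus one hand-written reordering rule. The vocabulary, rule, and function names are invented for illustration; real RBMT systems used far richer grammars, morphology, and lexicons.

```python
# Toy rule-based translation: dictionary lookup plus one reordering rule.
lexicon = {  # source word -> (target word, part of speech)
    "the": ("el", "DET"),
    "red": ("rojo", "ADJ"),
    "car": ("coche", "NOUN"),
}

def translate_rbmt(sentence: str) -> str:
    # Step 1: look up each word and its part of speech
    tagged = [lexicon.get(word, (word, "UNK")) for word in sentence.lower().split()]
    # Step 2: apply a reordering rule -- in the target language,
    # adjectives follow nouns, so swap any ADJ + NOUN pair.
    for i in range(len(tagged) - 1):
        if tagged[i][1] == "ADJ" and tagged[i + 1][1] == "NOUN":
            tagged[i], tagged[i + 1] = tagged[i + 1], tagged[i]
    # Step 3: emit the target-language words
    return " ".join(word for word, _ in tagged)

print(translate_rbmt("The red car"))  # -> "el coche rojo"
```

Every rule, exception, and dictionary entry had to be written and maintained by hand, which is exactly why researchers eventually looked for approaches that could learn from data instead.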

1980s and early 1990s

Corpus-Based Approaches: Statistical Machine Translation and Example-Based Machine Translation

After RBMT had been developed and used for a few decades, researchers turned to corpus-based approaches, which learn from large collections of written or spoken text stored alongside their existing translations. Two main styles of corpus-based MT were created:
  • Example-based machine translation (EBMT) - translates new text by analogy with previously translated phrases or short segments stored in the corpus.
  • Statistical machine translation (SMT) - chooses translations based on word frequencies and word combinations observed in the corpus.
These models did away with hand-written rules and focused on identifying the most likely translation options, producing significantly better output than RBMT.
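As a rough, hypothetical illustration of the statistical idea: count how often each source phrase was translated a certain way in a small “parallel corpus” of aligned phrase pairs, then pick the most frequent option. The data and names below are invented; real SMT systems combined phrase tables with language models and sophisticated decoding.

```python
from collections import Counter, defaultdict

# Toy "parallel corpus": (source phrase, observed human translation)
aligned_pairs = [
    ("good morning", "buenos días"),
    ("good morning", "buenos días"),
    ("good morning", "buen día"),
    ("thank you", "gracias"),
]

# Build a phrase table of translation counts
phrase_table = defaultdict(Counter)
for source, target in aligned_pairs:
    phrase_table[source][target] += 1

def translate_smt(phrase: str) -> str:
    # Pick the translation seen most often for this phrase in the corpus
    candidates = phrase_table.get(phrase)
    return candidates.most_common(1)[0][0] if candidates else phrase

print(translate_smt("good morning"))  # -> "buenos días"
```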

During this time, some significant online milestones occurred:
  • In 1992, the first online machine translation service was launched.
  • In 1995, SYSTRAN released the first web-based MT tool.
  • In 1997, search engine AltaVista's BabelFish, powered by SYSTRAN, provided real-time translations on the internet. This was the first time that lay people had access to an MT tool.

2000s

Neural Machine Translation

Neural machine translation (NMT) traces its roots to work done in 2003 at the University of Montreal, where researchers built language models on large neural networks. NMT uses a large neural network to model the entire translation process. In 2014, a sequence-to-sequence (Seq2Seq) model was developed for NMT, challenging the dominant SMT approach. After Seq2Seq, NMT became the state-of-the-art approach to MT.
Google Translate was launched in 2006 using SMT and ran on that approach until 2016, when Google moved to NMT and released Google Neural Machine Translation (GNMT) for nine languages. GNMT used recurrent neural networks (RNNs) to learn the mapping between input and output sentences, which required far less hand-engineered design. It also split rare words into smaller sub-word pieces, rather than working only with whole words, to find a translation.
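For readers who like to peek under the hood, here is a minimal, hypothetical sketch of the encoder-decoder (Seq2Seq) idea using PyTorch: one recurrent network reads the source sentence into a hidden state, and a second network generates target-language tokens from it. This is untrained toy code for illustration only, not GNMT or any production model.

```python
import torch
import torch.nn as nn

class TinySeq2Seq(nn.Module):
    """A toy encoder-decoder translation model (hypothetical, untrained)."""
    def __init__(self, src_vocab: int, tgt_vocab: int, emb: int = 32, hid: int = 64):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb)
        self.encoder = nn.GRU(emb, hid, batch_first=True)
        self.decoder = nn.GRU(emb, hid, batch_first=True)
        self.out = nn.Linear(hid, tgt_vocab)

    def forward(self, src_ids: torch.Tensor, tgt_ids: torch.Tensor) -> torch.Tensor:
        # Encode the whole source sentence into a final hidden state
        _, state = self.encoder(self.src_emb(src_ids))
        # Decode the target sequence, conditioned on that state
        dec_out, _ = self.decoder(self.tgt_emb(tgt_ids), state)
        # Return a score over the target vocabulary for every output position
        return self.out(dec_out)

model = TinySeq2Seq(src_vocab=1000, tgt_vocab=1000)
src = torch.randint(0, 1000, (1, 6))  # a made-up 6-token source sentence
tgt = torch.randint(0, 1000, (1, 5))  # a made-up 5-token target prefix
print(model(src, tgt).shape)          # torch.Size([1, 5, 1000])
```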

And Now

Emerging Technologies

In the last few years, technologies built on large-scale neural language models have sprung up. ChatGPT, probably the most well-known of these technologies, is an AI model developed by OpenAI and designed to hold interactive, engaging conversations. These technologies are relatively new and, like anything else, will take time to refine and improve.
Machine translation has come a long way since its inception more than 90 years ago. With the way that technology is rapidly advancing, it’s just a matter of time before the next big developments occur.

So, how can your business benefit from machine translation today?

Our Machine Translation Services


At MondragonLingua, we provide machine translation services for all kinds of companies and organizations. We offer two types of machine translation:

  • Post-edited MT
    Machine translation software generates an initial translation of the content, which then goes through a human post-edit to improve quality and accuracy. Machine translation post-editing is best when content needs to be of high quality.
  • Raw MT
    Raw MT is used when content doesn't need to be as polished. It's suitable for internal documents, getting the general idea of a text, and similar uses.

If you'd like to learn more about our machine translation services, please don't hesitate to contact us for expert guidance.
