A history of machine translation from the Cold War to deep learning

  • March 13, 2018

I open Google Translate twice as often as Facebook, and instant translation of price tags no longer feels like cyberpunk to me. That is simply reality now. It's hard to imagine that this is the result of a century-long struggle to build machine translation algorithms, and that for half of that period there was no visible success.

The developments I'll discuss in this article laid the foundation for all modern language processing systems, from search engines to voice-controlled microwaves. I'm talking about the evolution and structure of online translation today.

Source: freecodecamp.org
