Universal Sentence Encoder by Ray Kurzweil’s Team at Google

March 31, 2018
We present models for encoding sentences into embedding vectors that specifically target transfer learning to other NLP tasks. The models are efficient and result in accurate performance on diverse transfer tasks. Two variants of the encoding models allow for trade-offs between accuracy and compute resources.

For both variants, we investigate and report the relationship between model complexity, resource consumption, the availability of transfer task training data, and task performance. Comparisons are made with baselines that use word level transfer learning via pretrained word embeddings, as well as baselines that do not use any transfer learning. We find that transfer learning using sentence embeddings tends to outperform word level transfer.
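The word-level transfer baseline mentioned above is commonly implemented by averaging a sentence's pretrained word vectors into a single sentence vector, which can then be compared or fed to a downstream classifier. Below is a minimal, illustrative sketch of that baseline; the toy word vectors are made up for demonstration (the paper's actual baselines use pretrained embeddings, and the sentence encoders themselves are learned models, not averages):

```python
import math

# Toy "pretrained" word embeddings (hypothetical values, for illustration only).
WORD_VECS = {
    "the": [0.1, 0.3, 0.0],
    "cat": [0.9, 0.1, 0.2],
    "sat": [0.2, 0.8, 0.1],
    "dog": [0.8, 0.2, 0.3],
    "ran": [0.3, 0.7, 0.2],
}

def sentence_embedding(sentence):
    """Word-level transfer baseline: average the word vectors of a sentence."""
    dim = len(next(iter(WORD_VECS.values())))
    vecs = [WORD_VECS[w] for w in sentence.lower().split() if w in WORD_VECS]
    if not vecs:
        return [0.0] * dim
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]

def cosine(a, b):
    """Cosine similarity between two vectors; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

e1 = sentence_embedding("the cat sat")
e2 = sentence_embedding("the dog ran")
similarity = cosine(e1, e2)
```

A sentence-level encoder replaces `sentence_embedding` with a learned model that maps the whole sentence to a vector directly, which is what lets it capture word order and composition that simple averaging discards.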

With transfer learning via sentence embeddings, we observe surprisingly good performance with minimal amounts of supervised training data for a transfer task. We obtain encouraging results on Word Embedding Association Tests (WEAT) targeted at detecting model bias. Our pre-trained sentence encoding models are made freely available for download and on TF Hub.

Source: arxiv.org

