Train ALBERT for natural language processing with TensorFlow on Amazon SageMaker

At re:Invent 2019, AWS shared the fastest training times on the cloud for two popular machine learning (ML) models: BERT (natural language processing) and Mask-RCNN (object detection). To train BERT in 1 hour, we efficiently scaled out to 2,048 NVIDIA V100 GPUs by improving the underlying infrastructure, network, and ML framework. Today, we’re open-sourcing the optimized training code for […]

Top 10 Best FREE Artificial Intelligence Courses

Most Machine Learning, Deep Learning, Computer Vision, and NLP job positions, and in general every Artificial Intelligence (AI) job position, require you to have at least a bachelor’s degree in Computer Science, Electrical Engineering, or a similar field. If your degree comes from one of the world’s best universities, then your chances might be […]

A highly efficient, real-time text to speech system deployed on CPUs

Modern text-to-speech (TTS) systems have come a long way in using neural networks to mimic the nuances of human voice. To generate humanlike audio, one second of speech can require a TTS system to output as many as 24,000 samples — sometimes even more. The size and complexity of state-of-the-art models require massive computation, which […]

The Hateful Memes AI Challenge

We’ve built and are now sharing a data set designed specifically to help AI researchers develop new systems to identify multimodal hate speech. This content combines different modalities, such as text and images, making it difficult for machines to understand. The Hateful Memes data set contains 10,000+ new multimodal examples created by Facebook AI. We […]

Ultimate Guide to Natural Language Processing Courses

Selecting an online course that matches your requirements is very frustrating if you have high standards. Most courses are not comprehensive, and much of the time spent on them is wasted. How would you feel if someone provided you with a critical path and told you exactly which modules, and in which order, will […]

Word2Vec: A Comparison Between CBOW, SkipGram & SkipGramSI

Learn how the different Word2Vec architectures behave in practice, to help you make an informed decision about which architecture to use for the problem you are trying to solve. In this article, we will look at how the different neural network architectures for training a Word2Vec model compare in practice. The idea here is […]
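As a rough illustration of how that architectural choice shows up in code, here is a minimal gensim sketch; the toy corpus and hyperparameters are placeholders, and SkipGramSI is approximated with fastText-style skip-gram plus subword information, on the assumption that this is what the article means by it:

```python
from gensim.models import Word2Vec, FastText

# Toy corpus standing in for real tokenized sentences.
sentences = [
    ["natural", "language", "processing", "is", "fun"],
    ["word2vec", "learns", "dense", "word", "vectors"],
    ["skipgram", "predicts", "context", "from", "a", "target", "word"],
]

# CBOW: predict the target word from its surrounding context (sg=0).
cbow = Word2Vec(sentences, vector_size=100, window=5, min_count=1, sg=0)

# Skip-gram: predict the surrounding context from the target word (sg=1).
skipgram = Word2Vec(sentences, vector_size=100, window=5, min_count=1, sg=1)

# Skip-gram with subword information (assumed analogue of SkipGramSI):
# fastText builds vectors from character n-grams, so it can also embed
# out-of-vocabulary words.
skipgram_si = FastText(sentences, vector_size=100, window=5, min_count=1, sg=1)

print(cbow.wv.most_similar("word", topn=3))
print(skipgram_si.wv["unseenword"].shape)  # works even for an OOV token
```

The practical difference the article digs into is exactly this trade-off: CBOW trains faster and smooths over frequent words, skip-gram handles rare words better, and the subword variant additionally covers words never seen during training.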

The Dark Secrets Of BERT

BERT stands for Bidirectional Encoder Representations from Transformers. This model is basically a multi-layer bidirectional Transformer encoder (Devlin, Chang, Lee, & Toutanova, 2019), and there are multiple excellent guides about how it works generally, including the Illustrated Transformer. What we focus on is one specific component of the Transformer architecture known as self-attention. In a nutshell, it is […]
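For readers who want the mechanics behind that component, here is a minimal NumPy sketch of the scaled dot-product self-attention computed by a single Transformer head, softmax(QK^T / sqrt(d_k)) V; the toy shapes and random weights are illustrative only, not the article’s code:

```python
import numpy as np

def scaled_dot_product_self_attention(X, W_q, W_k, W_v):
    """Single-head self-attention: softmax(QK^T / sqrt(d_k)) V."""
    Q, K, V = X @ W_q, X @ W_k, X @ W_v           # project tokens to queries/keys/values
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # pairwise token-to-token relevance
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                            # each output mixes all value vectors

# Toy example: 4 tokens, model dimension 8, head dimension 4.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
W_q, W_k, W_v = (rng.normal(size=(8, 4)) for _ in range(3))
print(scaled_dot_product_self_attention(X, W_q, W_k, W_v).shape)  # (4, 4)
```

The row-wise softmax makes each output token a weighted mixture of every value vector, and those attention weight patterns are what analyses like this article inspect.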

10 Best Machine Learning Textbooks that All Data Scientists Should Read

Machine learning is an intimidating topic to tackle for the first time. The term encompasses so many fields, research topics and business use cases, that it can be difficult to even know where to start. To combat this, it’s often a good idea to turn to textbooks that will introduce you to the basic principles […]

The Best NLP Papers From ICLR 2020

I went through the 687 papers that were accepted to the ICLR 2020 virtual conference (out of 2,594 submitted – up 63% since 2019!) and identified 9 papers with the potential to advance the use of deep learning NLP models in everyday use cases. Source: topbots

A Hacker’s Guide to Efficiently Train Deep Learning Models

Three months ago, I participated in a data science challenge that took place at my company. The goal was to help a marine researcher better identify whales based on the appearance of their flukes. More specifically, we were asked to predict, for each image in a test set, the top 20 most similar images from […]