EGG: A toolkit for language emergence simulations with neural networks

  • August 4, 2019

EGG is a new toolkit that allows researchers and developers to quickly create game simulations in which two neural network agents devise their own discrete communication system in order to solve a task together. For example, in one of the implemented games, one agent sees a handwritten digit and has to invent a communication code to tell the other agent which number it represents. A lively area of machine learning (ML) research, language emergence would benefit from a more interdisciplinary approach.

However, the barrier to entry is high, as modern discrete-communication games require expertise in advanced areas of ML, such as deep sequence-to-sequence modeling and reinforcement learning. EGG lowers this barrier and makes it possible for scientists, engineers, and hobbyists with basic programming skills to design and test new games.
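
To make the setup concrete, here is a minimal, self-contained PyTorch sketch of the kind of signalling game described above. It does not use EGG's actual API; the class names, vocabulary size, and training loop are illustrative assumptions. A sender encodes an input (here, a one-hot digit label standing in for the handwritten image) into a discrete symbol via the Gumbel-Softmax relaxation, and a receiver must recover the digit from that symbol alone; both agents are trained end to end on the shared classification loss.

# Illustrative sketch only -- not EGG's API. A sender maps an input to a
# discrete symbol and a receiver must guess the original class from it.
import torch
import torch.nn as nn
import torch.nn.functional as F

N_CLASSES = 10    # digit identities 0-9
VOCAB_SIZE = 16   # number of discrete symbols available to the agents

class Sender(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(N_CLASSES, VOCAB_SIZE)

    def forward(self, x, tau=1.0):
        # Emit a one-hot symbol over the vocabulary; hard=True keeps the
        # forward pass discrete while gradients flow through the relaxation.
        return F.gumbel_softmax(self.fc(x), tau=tau, hard=True)

class Receiver(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(VOCAB_SIZE, N_CLASSES)

    def forward(self, symbol):
        # Guess the original class from the received symbol alone.
        return self.fc(symbol)

sender, receiver = Sender(), Receiver()
opt = torch.optim.Adam(
    list(sender.parameters()) + list(receiver.parameters()), lr=1e-2)

for step in range(2000):
    labels = torch.randint(0, N_CLASSES, (32,))
    inputs = F.one_hot(labels, N_CLASSES).float()
    logits = receiver(sender(inputs))
    loss = F.cross_entropy(logits, labels)  # shared success signal
    opt.zero_grad()
    loss.backward()
    opt.step()

In EGG itself, this kind of plumbing is packaged into reusable game wrappers and a trainer, so users mostly supply the sender and receiver architectures and a loss.
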

Source: fb.com


Related Posts

Using natural language processing to manage healthcare records

The next time you see your physician, consider the times you fill in a paper form. It may seem trivial, but the information could be crucial to making a better diagnosis. Now consider the other forms of healthcare data that permeate your life—and that of your doctor, nurses, and the clinicians working to keep patients thriving.

Mapping roads through deep learning and weakly supervised training

Creating accurate maps today is a painstaking, time-consuming manual process, even with access to satellite imagery and mapping software. Many regions — particularly in the developing world — remain largely unmapped. To help close this gap, Facebook AI researchers and engineers have developed a new method that uses deep learning and weakly supervised training to predict road networks from commercially available high-resolution satellite imagery.

Introducing EvoGrad: A Lightweight Library for Gradient-Based Evolution

Tools that enable fast and flexible experimentation democratize and accelerate machine learning research. Take, for example, the development of libraries for automatic differentiation, such as Theano, Caffe, TensorFlow, and PyTorch: these libraries have been instrumental in catalyzing machine learning research, enabling gradient-descent training without the tedious work of hand-computing derivatives. In these frameworks, it's simple to experiment by adjusting the size and depth of a neural network, by changing the error function to be optimized, and even by inventing new architectural elements, such as layers and activation functions, all without having to worry about how to derive the resulting gradient of improvement.
