ONNX expansion speeds AI development

May 4, 2018

Facebook helped develop the Open Neural Network Exchange (ONNX) format to allow AI engineers to more easily move models between frameworks without having to do resource-intensive custom engineering. Today, we’re sharing that ONNX is adding support for additional AI tools, including Apple Core ML converter technology, Baidu’s PaddlePaddle platform, and Qualcomm SNPE.

Source: facebook.com
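
As a rough illustration of the interchange workflow the post describes, here is a minimal sketch assuming PyTorch and the onnx Python package: a toy model is exported to an .onnx file and the resulting graph is validated. The vendor-specific conversion step (for example, feeding the .onnx file to the Core ML or SNPE converters mentioned above) is omitted.

```python
import torch
import torch.nn as nn
import onnx

# A toy model standing in for a trained network.
model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))
model.eval()

# Export via tracing: run the model once on a dummy input of the expected shape.
dummy_input = torch.randn(1, 784)
torch.onnx.export(model, dummy_input, "model.onnx")

# Load the exported graph and check that it is a well-formed ONNX model
# before handing it to a downstream converter (Core ML, SNPE, etc.).
onnx_model = onnx.load("model.onnx")
onnx.checker.check_model(onnx_model)
print(f"Exported graph with {len(onnx_model.graph.node)} nodes")
```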


Related Posts

Facebook Open Sources ELF OpenGo

Inspired by DeepMind’s work, we kicked off an effort earlier this year to reproduce their recent AlphaGoZero results using FAIR’s Extensible, Lightweight Framework (ELF) for reinforcement learning research. The goal was to create an open source implementation of a system that would teach itself how to play Go at the level of a professional human player or better. By releasing our code and models we hoped to inspire others to think about new applications and research directions for this technology.

What tech calls “AI” isn’t really AI

First, the problem itself is poorly defined: what do you mean by intelligence? Nature, with all her blind hideous strength, endless experimentation and wild wastes of infinite time, has only managed the trick once (by our narrow definition), with one species of tree-ape on a rolling green world. Even if you believe there’s intelligent biological life elsewhere, the stats aren’t promising.

Measuring the Intrinsic Dimension of Objective Landscapes

In our paper, Measuring the Intrinsic Dimension of Objective Landscapes, to be presented at ICLR 2018, we contribute to this ongoing effort by developing a simple way of measuring a fundamental network property known as intrinsic dimension. In the paper, we develop intrinsic dimension as a quantification of the complexity of a model in a manner decoupled from its raw parameter count, and we provide a simple way of measuring this dimension using random projections. We find that many problems have smaller intrinsic dimension than one might suspect.
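
The random-projection idea lends itself to a short illustration. The sketch below is a toy rendering of the subspace-training trick (not the authors' code, and with made-up sizes and a stand-in objective): the full parameter vector is written as theta_0 plus a fixed random projection of a much smaller trainable vector, and only that small vector is optimized. Sweeping the subspace size d and finding the smallest d that recovers most of the baseline performance gives an estimate of the intrinsic dimension.

```python
import torch

D = 10_000  # number of native parameters (toy size)
d = 100     # dimension of the random subspace being tested

theta_0 = torch.randn(D)                      # fixed random initialization
P = torch.randn(D, d) / d ** 0.5              # fixed random projection
theta_d = torch.zeros(d, requires_grad=True)  # the only trained variable

def objective(theta):
    # Stand-in objective; a real experiment would unpack theta into the
    # weights of a network and compute its training loss.
    return ((theta - 1.0) ** 2).mean()

opt = torch.optim.SGD([theta_d], lr=0.1)
for _ in range(200):
    opt.zero_grad()
    loss = objective(theta_0 + P @ theta_d)  # project back to full space
    loss.backward()
    opt.step()

print(f"final loss in a {d}-dim subspace of {D} parameters: {loss.item():.4f}")
```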
