AI chip startup Wave to buy Silicon Valley old-timer MIPS

  • June 15, 2018

Wave Computing has been aiming its AI chips at powerful computers, but acquiring MIPS will let it reach small devices, too. Call it a May-December marriage: a hot new AI chip startup called Wave Computing has acquired a veteran of the processor business, MIPS Technologies, CNET has learned. Wave, founded in 2010, aims to speed up artificial intelligence workloads with its custom processors.

But those chips are geared for companies with a big appetite for in-house AI work, or for the data centers packed with computers that often supply computing power to others. By acquiring MIPS, Wave has a chance to spread its technology from those central areas to the much wider world of devices in our lives. Wave didn't comment for this story, but it plans to announce the acquisition, a source familiar with the move said.

MIPS, founded in 1984, has a long history in Silicon Valley. One of its founders was John Hennessy, who just received the prestigious Turing Award as one of the pioneers of a chip technology called RISC, short for reduced instruction set computing.

After MIPS, Hennessy became president of Stanford University and is now chairman of Google parent Alphabet. MIPS, meanwhile, was acquired by Silicon Graphics and later Imagination Technologies, but has been independent since late 2017.

Source: cnet.com

