Train Your Machine Learning Models on Google’s GPUs for Free

  • March 16, 2018

Training your model is hands down the most time-consuming and expensive part of machine learning. Training on a GPU can give you speedups close to 40x, turning a two-day run into a few hours. However, this normally comes at a cost to your wallet.
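As a quick sanity check on that claim, here is the back-of-envelope arithmetic (the 40x figure and two-day baseline are the article's, not measured numbers):

```python
# Rough estimate of training time saved by the ~40x GPU speedup
# claimed above, starting from a 2-day CPU training run.
cpu_hours = 2 * 24   # two days of CPU training
speedup = 40         # approximate GPU speedup from the article
gpu_hours = cpu_hours / speedup

print(f"{cpu_hours} h on CPU -> {gpu_hours:.1f} h on GPU")  # 48 h -> 1.2 h
```

So a 40x speedup does indeed take a two-day job down to "a few hours" (or less).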

Source: hackernoon.com


Related Posts

Deep Neural Network implemented in pure SQL over BigQuery


In this post, we’ll implement a deep neural network with one hidden layer (with ReLU and softmax activation functions) purely in SQL. The end-to-end steps of neural network training, including the forward pass and back-propagation, will be implemented as a single SQL query on BigQuery. Because it runs on BigQuery, we are in effect performing distributed neural network training on hundreds to thousands of servers.
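For reference, the forward pass of the network described (one hidden layer, ReLU, then softmax) looks roughly like this in plain Python rather than SQL; the weights `W1`, `b1`, `W2`, `b2` are illustrative placeholders, not values from the post:

```python
import math

def relu(v):
    # Element-wise max(0, x) for the hidden layer
    return [max(0.0, x) for x in v]

def softmax(v):
    # Numerically stable softmax: shift by the max before exponentiating
    m = max(v)
    exps = [math.exp(x - m) for x in v]
    s = sum(exps)
    return [e / s for e in exps]

def matvec(W, x, b):
    # Affine transform: W @ x + b, with W as a list of rows
    return [sum(w * xi for w, xi in zip(row, x)) + bi
            for row, bi in zip(W, b)]

def forward(x, W1, b1, W2, b2):
    h = relu(matvec(W1, x, b1))        # hidden layer + ReLU
    return softmax(matvec(W2, h, b2))  # output layer + softmax
```

The SQL version expresses the same matrix-vector products and activations as query expressions, which is what lets BigQuery parallelize them across many servers.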

Using Google Cloud AutoML to Classify Poisonous Australian Spiders


Google’s Cloud AutoML Vision is a new machine learning service from Google Cloud that aims to make state-of-the-art machine learning techniques accessible to non-experts. In this post I will show how, in just a few hours, I created a custom image classifier that can distinguish between different types of poisonous Australian spiders. I had no data when I started, and the project required only a very basic understanding of machine learning concepts.
