Machine learning algorithms used to decode and enhance human memory

March 5, 2018

When it comes to brain measurements, the best recordings come from inside the cranium. But people—and institutional review boards—aren’t usually amenable to cracking open skulls in the name of science. So Kahana and his colleagues collaborated with 25 epilepsy patients, each of whom had between 100 and 200 electrodes implanted in their brain (to monitor seizure-related electrical activity).

Kahana and his team piggybacked on those implants, using the electrodes to record high-resolution brain activity during memory tasks.

Source: wired.com


Related Posts

Machine Learning Workflow on Diabetes Data : Part 01

This article shows how diabetes-related data can be leveraged to predict whether a person has diabetes, and more broadly how machine learning can be applied to disease prediction. By the end of this series you will understand concepts such as data exploration, data cleansing, feature selection, model selection, and model evaluation, and be able to apply them in practice.
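The workflow steps named above chain together naturally. A minimal sketch, using synthetic stand-in data rather than the article's actual diabetes dataset:

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for a diabetes dataset: 8 clinical features and a
# binary diabetic/non-diabetic label (the article uses real patient records).
X, y = make_classification(n_samples=500, n_features=8, n_informative=5,
                           random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)

# Cleansing/scaling -> feature selection -> model fitting, as one pipeline.
model = Pipeline([
    ("scale", StandardScaler()),              # normalize feature ranges
    ("select", SelectKBest(f_classif, k=5)),  # keep the 5 strongest features
    ("clf", LogisticRegression(max_iter=500)),  # one candidate model
])
model.fit(X_train, y_train)

# Model evaluation on held-out data.
acc = accuracy_score(y_test, model.predict(X_test))
print(f"held-out accuracy: {acc:.2f}")
```

Wrapping the steps in a `Pipeline` keeps the scaler and feature selector fitted only on training data, which avoids leaking test-set information into the model.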

Hacking the Brain with Adversarial Images

This is an example of what’s called an adversarial image: an image specifically crafted to fool neural networks into misidentifying what they are looking at. Researchers at Google Brain set out to determine whether the same techniques that fool artificial neural networks can also fool the biological neural networks inside our heads, by developing adversarial images that make both computers and humans think they are looking at something they aren’t.
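One standard recipe for crafting such images is the fast gradient sign method (FGSM): nudge every pixel slightly in the direction that increases the classifier's loss. The toy sketch below is illustrative only, using a made-up linear classifier rather than the Google Brain models:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
n_pixels = 64                  # a toy 8x8 "image", flattened
w = rng.normal(size=n_pixels)  # weights of a (pretend) trained classifier
x = rng.normal(size=n_pixels)  # the clean input image
y = 1.0                        # its true label

# Gradient of the logistic loss with respect to the *input* pixels:
# dL/dx = (p - y) * w, where p is the model's predicted probability.
p = sigmoid(w @ x)
grad_x = (p - y) * w

# FGSM step: shift each pixel by +/- eps in the loss-increasing direction.
eps = 0.25
x_adv = x + eps * np.sign(grad_x)

print("clean prob:      ", sigmoid(w @ x))
print("adversarial prob:", sigmoid(w @ x_adv))
```

The perturbation is bounded per pixel by `eps`, so the adversarial image can stay visually near-identical to the original while the model's confidence in the true label drops sharply.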

So what’s new in AI?

I graduated with a degree in AI when computing power equivalent to an iPhone’s cost $50 million. A lot has changed, but surprisingly much is still the same.
