Intel’s New Path to Quantum Computing

  • June 15, 2018
Intel’s director of quantum hardware, Jim Clarke, explains the company’s two quantum computing technologies, covering:

  • The limits of Tangle Lake’s technology
  • Silicon spin qubits and how far away they are
  • The importance of cryogenic control electronics
  • Top quantum computing applications
  • What problems keep him up at night
  • AI vs. quantum computing: which will be more important?

IEEE Spectrum: What’s special about Tangle Lake?

Jim Clarke: I can’t underscore enough, for these systems, how important the packaging is. Typically we make our computers to run at room temperature, in our back pocket or on our wrist, or at slightly higher temperatures, but never at a fraction of a degree above absolute zero [as you need for superconducting qubits]. So these guys developed a package that could withstand the temperatures mechanically and still be relatively clean from a signal perspective.

IEEE Spectrum: Is there a limit to the density of qubits using the technology in Tangle Lake? Those pinouts look pretty big.

Clarke: I think this is already the largest chip-to-[printed circuit board] attachment that Intel has ever done. Any larger than this on a single piece, and the coefficient of thermal expansion and shrinkage would be severe. Not only that, as you can see, the actual connectors have a very large footprint, and those are important right now.
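Clarke’s point about thermal expansion is easy to quantify with a back-of-the-envelope calculation. The sketch below (Python) uses rough literature values for the total fractional contraction of common packaging materials between room temperature and ~4 K, plus an assumed 50 mm attachment span; all numbers are illustrative, not Intel’s.

```python
# Back-of-the-envelope differential thermal contraction between a
# qubit chip and its circuit board on cooldown to cryogenic temperature.
# Fractional contractions are rough literature values for the total
# shrinkage from 300 K to ~4 K (contraction largely saturates below
# that); treat every number here as illustrative.

contraction = {                # fractional length change, 300 K -> ~4 K
    "silicon chip": 0.00022,   # ~0.02 %
    "copper":       0.0033,    # ~0.33 %
    "FR-4 board":   0.007,     # ~0.7 % (in-plane)
}

span_mm = 50.0  # assumed chip-to-board attachment span (hypothetical)

si_shrink = contraction["silicon chip"] * span_mm
for material, frac in contraction.items():
    shrink = frac * span_mm
    print(f"{material:13s} shrinks {shrink * 1000:6.1f} um, "
          f"mismatch vs. silicon {abs(shrink - si_shrink) * 1000:6.1f} um")
```

Over a 50 mm span, the board shrinks a few hundred microns more than the silicon chip it carries. That mismatch is comparable to, or larger than, typical interconnect pitches, which is why stretching a single chip-to-board attachment much further becomes mechanically severe.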

We can go bigger (more qubits per chip) with this technology, but not by much. So what we’ll do is work to make the qubits smaller and the connections smaller. Within the same size footprint, we can maybe increase the number of qubits severalfold.
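The severalfold estimate follows from simple area scaling: if the linear footprint of each qubit cell (the qubit plus its share of wiring fan-out) shrinks by a factor s, a fixed package area holds roughly s² times as many qubits. A minimal sketch, with invented pitch and area numbers:

```python
# Qubit count vs. linear shrink factor in a fixed package area.
# Both numbers below are hypothetical placeholders, not Intel specs.

package_area_mm2 = 2500.0   # fixed package area (assumed 50 mm x 50 mm)
cell_mm = 5.0               # current per-qubit cell pitch (assumed)

for shrink in (1.0, 1.5, 2.0):
    pitch = cell_mm / shrink
    qubits = int(package_area_mm2 // pitch**2)
    print(f"shrink x{shrink:.1f}: ~{qubits} qubits")
```

A 2x linear shrink quadruples the count, which is the kind of severalfold gain Clarke describes, but nowhere near the orders of magnitude needed to reach millions of qubits.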

But it’s hard to reach a place with that technology where you’d have the millions of qubits you would need to do something really life-altering.

Source: ieee.org


Related Posts

The 50 Best Free Datasets for Machine Learning

What are some open datasets for machine learning? We at Gengo decided to create the ultimate cheat sheet for high-quality datasets. These range from the vast (looking at you, Kaggle) to the highly specific (data for self-driving cars).

Read More
Training a neural network in phase-change memory beats GPUs

Compared to a typical CPU, a brain is remarkably energy-efficient, in part because it combines memory, communications, and processing in a single execution unit, the neuron. A brain also has lots of them, which lets it handle lots of tasks in parallel. Attempts to run neural networks on traditional CPUs run up against these fundamental mismatches.

Read More
AI winter is well on its way

Deep learning has been at the forefront of the so-called AI revolution for quite a few years now, and many people believed it was the silver bullet that would take us to the world of wonders of technological singularity (general AI). Many bets were made in 2014, 2015, and 2016, when new boundaries were still being pushed, such as AlphaGo. Companies such as Tesla announced, through their CEOs, that fully self-driving cars were very close, to the point that Tesla even started selling that option to customers [to be enabled by a future software update].

Read More