Turning a million-qubit quantum computing dream into reality
James Clarke believes that quantum computing will only become practical when industry manufactures chips stuffed with more than a million error-corrected quantum bits.
The goal of creating a quantum system with so many qubits isn’t unique to any one company — IBM, Google, and startups like PsiQuantum have all announced plans to build such grandiose machines — but Clarke, director of quantum hardware at Intel, believes the semiconductor giant has a unique advantage in making this a reality through its manufacturing-focused approach to development.
In a peer-reviewed research paper published earlier this year, Intel claims to have successfully fabricated more than 10,000 dies, each with three to 55 quantum dots, on a 300-millimeter wafer with over 95 percent yield. The milestone, which the chipmaker reached in partnership with Dutch research institute QuTech, represents a significantly higher yield and qubit count than universities and labs, including those used by other companies, have achieved to date.
Clarke says achieving such a feat was not trivial. It was made possible in large part by the fact that Intel, unlike most other companies pursuing quantum goals, runs its own fabs, which the company has also used to fabricate the control logic needed to support such a density of qubits.
“What we’ve done is we’ve taken the university-style approach to fabricating qubits, and we’ve used the tools from our toolkit from our advanced transistor factory to fabricate these devices with very high uniformity, very high yield, and good performance,” Clarke tells The Next Platform.
When Intel began its quantum efforts in 2015 with QuTech, which is partnered with Delft University of Technology in the Netherlands, the two organizations explored several ways to fabricate qubits. One promising avenue was the superconducting qubit, which saw the company produce a 17-qubit superconducting test chip in 2017.
But Clarke says Intel and QuTech have since found greater promise in spin qubits, which involve “encoding the zero or one of the qubit into the spin of a single electron.” Each of these electrons is “essentially trapped in the channel of what looks like a transistor,” which is why the chipmaker was able to use its transistor factories to make these types of quantum chips.
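To make that encoding concrete, here is a minimal sketch of a single spin qubit in standard quantum-mechanics notation using NumPy. This is not Intel's code, just an illustration of the general principle: the electron's two spin states serve as the |0⟩ and |1⟩ basis states, and a spin flip (the Pauli-X operation) acts as the quantum analogue of a classical NOT.

```python
import numpy as np

# The two spin states of a trapped electron encode the qubit basis states.
spin_down = np.array([1, 0], dtype=complex)  # |0>
spin_up   = np.array([0, 1], dtype=complex)  # |1>

# Pauli-X ("spin flip") swaps the basis states: the quantum NOT gate.
pauli_x = np.array([[0, 1],
                    [1, 0]], dtype=complex)

flipped = pauli_x @ spin_down
assert np.allclose(flipped, spin_up)

# Unlike a classical bit, a qubit can sit in a superposition a|0> + b|1>,
# with |a|^2 + |b|^2 = 1 giving the measurement probabilities.
plus = (spin_down + spin_up) / np.sqrt(2)
prob_down = abs(plus[0]) ** 2  # probability of measuring |0>: 0.5
```

The physical appeal of this scheme, as Clarke notes, is that the structure confining the electron resembles a transistor channel, so the same fab tooling applies.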
The decision to forgo the superconducting qubit route, which other organizations are taking, has apparently paid off, according to Clarke, because Intel’s spin qubits are “about a million times smaller.”
“So while we’re not there today, in the future we think we can scale much faster and achieve a much higher density of qubits in our devices,” he says.
The possibility of packing 10,000 arrays of spin qubits into a single wafer has an exciting implication for Clarke, even if it is currently theoretical.
“If we were to produce many of these wafers – or should I say, when we do, when we do it regularly – if we tested them all, we would have created more qubits on these wafers than any company has ever created over the lifetime of their experiments. That would be my guess,” he says. “Universities and their research labs make them a few at a time. Even in the superconducting space, I think the number would be much smaller.”
The other benefit Intel gets from making its own quantum chips is that, like the other chips it develops, it can run statistical analyses on them to make further improvements.
“We can feed this information back to our factory to make better devices. We can then select the best devices at this stage and pass them on for further testing. So by having a wafer full of devices, we really get a huge amount of data, which allows us to go a lot faster,” Clarke explains.
Even so, Clarke thinks the industry is still about a decade away from having a quantum computer that can be used for practical purposes in areas such as cryptography, optimization, chemistry, materials, and finance. That may seem like a long time, but put into perspective against other technologies Intel has developed, the timeline doesn’t seem out of place.
“If you look at the timeline from the first transistor, the first integrated circuit, and the first microprocessor, those timelines tend to happen over a period of 10 to 15 years. And so every big breakthrough that Intel has made – high-k metal gate, tri-gate – has happened on a decade-like timeline. That’s not to say people can’t go faster, but these are hard things to do. Quantum is more difficult than making a transistor. So why should we expect this to happen faster than a typical technology development cycle?” he says.