Merging AI and Quantum Computing To Boost Drug Discovery
Quantum computing is a growing trend, as recent major advances in hardware are making the technology practically feasible. Some experts compare today's rapid progress in quantum computing to a similar period during the last century, when personal computing emerged through a cascade of technological advances in both hardware and software.
Will quantum change everything? When? Are we there yet?
To find answers to the above questions and learn a thing or two about what the quantum computing industry looks like today, I sat down with Dr. Christopher Savoie, co-founder and CEO of Zapata Computing -- an American quantum software company on the cutting edge of research in this area.
Christopher is a scientist and a serial tech entrepreneur with 20+ years of experience in the technology industry. He is the inventor of the Natural Language Understanding (NLU) technology behind Apple's Siri, and he was recognized as a top business leader and innovator by Nikkei and MIT Technology Review.
Christopher co-founded Zapata Computing out of Harvard in 2017, along with his colleagues Alán Aspuru-Guzik, Jhonathan Romero, Jonathan Olson, Peter Johnson, and Yudong Cao.
The interview text below has been edited for length. Watch the original video for the complete interview.
Andrii: I would like to first start with a very general question -- there are headlines involving top companies like Google, for example, about “quantum supremacy” and a lot of theoretical work has been done in the field. Now it seems like the industry is finally getting to some practical valuable applications. Can you explain where we are right now with this technology?
Christopher: We're in the early days of this technology, and the hardware itself is still being developed as we speak. There are many ways to make these qubits, and the theory that a qubit can work has already been proven experimentally. We know that qubits work, we have them, we have these computers, and they can do relatively simple calculations right now -- so the theory behind it is not even in question.
The question, however, is which of the different ways of making a qubit will prove the most effective and the most scalable. Google and IBM have a superconducting type of qubit, which has its advantages, including the speed of its gate operations. Then there are other types of these computers that have nothing to do with superconductors: qubits built with ion traps, with neutral atoms, and with photonics. All of these competing platforms are trying to create a hardware device that realizes superposition and entanglement, performs these wonderful calculations, and, at the end of the day, becomes fault-tolerant, meaning robust enough to keep operating and do its calculations perfectly.

So they are all doing the same calculations, but in very different ways. There is a lot of development in each of these areas, so it is very nuanced and very hard to know, from a user's perspective or a pharmaceutical company's perspective -- or, for that matter, any other company's -- which of these platforms is going to be ideal for a given problem. We are in the early days of discovery, and I am of the opinion that it is not a zero-sum game -- it is not going to be just one of these solutions that wins everything. I think each of these platforms will be good for certain applications.
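As a concrete illustration of the superposition and entanglement these platforms compete to realize, here is a minimal NumPy sketch (an editorial illustration, not part of the interview) that simulates preparing a two-qubit Bell state: a Hadamard gate creates a superposition, and a CNOT gate entangles the two qubits so their measurement outcomes are perfectly correlated.

```python
import numpy as np

# Single-qubit basis states |0> and |1>
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)

# Hadamard gate: puts a qubit into an equal superposition of |0> and |1>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# CNOT gate: flips qubit 1 when qubit 0 is |1>, entangling the pair
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start in |00>, apply H to qubit 0, then CNOT -> Bell state (|00>+|11>)/sqrt(2)
state = np.kron(zero, zero)
state = np.kron(H, np.eye(2)) @ state
state = CNOT @ state

# Measurement probabilities for the outcomes |00>, |01>, |10>, |11>
probs = np.abs(state) ** 2
print(np.round(probs, 3))  # -> [0.5 0.  0.  0.5]
```

Only the |00> and |11> outcomes appear, each with probability 1/2: neither qubit has a definite value on its own, yet measuring one instantly determines the other, which is exactly the resource the competing hardware platforms are racing to scale up.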
For example, an ion trap right now has higher fidelity -- the qubits have less error built into them than the superconducting qubits that IBM or Google might have. Companies like IonQ and Honeywell have these machines, but their gate speeds, the speed of operation, are much slower. So there is a trade-off here. If you want lots and lots of samples really quickly, the superconducting machines are preferable, as long as you can tolerate a little more error. Conversely, if you have something that really needs that fidelity, maybe the ion traps are a better solution. The choice very much depends on the needs, on the use case.

In 2019 Google released a paper saying they had accomplished something that would be difficult to do on a classical computer, even a supercomputer -- they compared it to the Oak Ridge machine. The example they ran wasn't particularly useful in itself; it is not as if the problem they solved could be directly applied to chemistry or biology or anything we might care about, such as drug discovery. But it showed that at 53 qubits you can do something a classical computer would have a difficult time doing. So we are getting to the regime where we can see that there will be applications and use cases where a quantum computer, empirically and experimentally -- not just theoretically -- is going to be superior to classical computers.
We have shown in recent results that we can get enhancements to, let's say, machine learning using a quantum computer. Even a relatively small quantum computer can give us some advantage in the way we do neural networks and things like generative adversarial networks (GANs). That points to a perhaps quite near-term possibility that things we care about, like neural networks and machine learning, may be enhanced by quantum computers. So we are at the point where this is starting to get very real.
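To give a flavor of how a quantum circuit can slot into a machine learning loop, here is a toy sketch (an editorial illustration; this is not Zapata's actual method). A single-qubit "quantum layer" with one trainable rotation parameter is simulated classically with NumPy, and its gradient is computed with the parameter-shift rule, the standard trick for obtaining exact gradients from real quantum hardware.

```python
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation gate RY(theta)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def expectation(theta):
    """<Z> after applying RY(theta) to |0> -- the 'quantum layer' output."""
    state = ry(theta) @ np.array([1.0, 0.0])
    Z = np.array([[1.0, 0.0], [0.0, -1.0]])
    return state @ Z @ state

def parameter_shift_grad(theta):
    """Exact gradient of <Z> via the parameter-shift rule,
    which needs only two circuit evaluations on hardware."""
    return 0.5 * (expectation(theta + np.pi / 2) - expectation(theta - np.pi / 2))

# Gradient descent: train theta so the layer's output matches a target of -1
theta, lr, target = 0.1, 0.5, -1.0
for _ in range(500):
    loss_grad = 2 * (expectation(theta) - target) * parameter_shift_grad(theta)
    theta -= lr * loss_grad
print(expectation(theta))  # close to the target of -1
```

The same pattern -- evaluate a parameterized circuit, estimate gradients by shifting parameters, update with a classical optimizer -- is what hybrid quantum-classical approaches to neural networks and quantum GANs build on, just with many more qubits and parameters.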