Note: My intention with this article is not to bash any particular company or product. It is just my commentary about a subject I felt needed to be talked about. I tried my best to stay unbiased towards or against any company or product when I wrote this piece.

As Quantum Computing transitions from being purely theoretical to a subject with a vibrant practical aspect to it, we need to think about the tools for doing the job. Before I go on with the central theme of this blogpost, let me address a key issue in quantum computing education. It is my opinion that the best way to begin learning quantum computing (at the time of this being published) is either by a book (1, 2, 3) or a course (traditional or online) that primarily focuses on the core theoretical principles, with linear algebra in all its glory. SDKs and APIs are supplementary to the core subject (at least when you begin your journey). The problem with learning quantum computing with an SDK as the focus is that your idea of what a quantum computer can do will be filtered through the lens of what the SDK can do, and especially what the hardware behind the SDK can do. Maybe this paradigm can be revised in the future.

Anyways, back to the main topic. As different hardware vendors emerge in the market, we also get new software tools to access these machines. In other words, it is not just the wild west era for quantum hardware, but an equally chaotic era in the realm of quantum SDKs and frameworks. We have the OCEAN SDK (and D-wave SAPI) for D-wave quantum annealers, QISKIT for IBM machines, Cirq for Google machines, the QDK (and Q#) for Microsoft's effort, the Forest SDK from Rigetti, and many more to follow. Most of these tools are made available under an open-source license (e.g., 4, 5, 6, 7), either fully or in part. They are also being (very smartly) branded in a way that deemphasizes the specifics of the backend, although it is obvious that QISKIT primarily supports IBMQ, Cirq supports Google, and so on. Theoretically, all the companies could unite behind one particular open-source project and make their machines work through it. But then they might lose out on their USP.

However, if you look at things from a comp-sci researcher/developer's point of view, these SDKs are making it more challenging for people to learn to code on those machines. It is a good thing that most of these vendors have agreed on a common language: Python. Microsoft is the biggest exception with Q# (although the QDK has made Q# and Python interoperable). It is a no-brainer that as hardware for quantum computing is standardized to one or two forms, the software tools will be too. For the time being, however, I feel that learning at least two of these SDKs really well is important (a small side-by-side taste of two of them appears at the end of this post). After that, it's all as per your project requirements.

But the entire situation is further complicated by a concept originally designed to alleviate the above problem: hardware-agnostic quantum platforms (let's call them HAQPs for this article), designed to cut out the extra steps from the process. It is a good idea, and it has caused a multitude of organizations to jump into the fray with their offerings: XACC (by ORNL), 1QLOUD (by 1Qbit), QC Ware's cloud platform, and Black Opal (by Q-Ctrl), among several others. This raised another question in my mind as someone who is building a career in Quantum Information Sciences: which one should I go with? Please do note that I am not advocating against the existence of choice(s) for the customer; that's a good thing. I am just trying to make sense of them, mostly for myself.
I suppose we can start with the question: what differentiates your tool from the rest? Black Opal seems to be going for noise characterization, decoherence reduction, etc. QC Ware's platform is targeted more at enterprise use cases. These are just some examples of potential answers to the above question. Personally, as a PhD student who currently has some freedom in choosing his projects and tools, I don't have a present need for a hardware-agnostic platform for my projects. But if you have such a requirement: take a look at XACC first (since it's free) and see if it is good enough, or look at the other HAQPs out there if you have specific requirements. A lot of other things can be said about this topic, but this is my brief take on it. Let me know if you have any feedback. For a more comprehensive list of the kinds of tools available in the field, I would encourage you to check this out.
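As a quick aside on the "learn at least two SDKs" point from earlier: below is the same two-qubit Bell-state circuit written in Qiskit and in Cirq. This is just a minimal sketch of my own (it assumes `qiskit` and `cirq` are installed, and APIs may shift between versions), but it shows how the same underlying concepts get dressed differently in each framework.

```python
# The same Bell-state circuit in two SDKs (toy example; assumes
# `pip install qiskit cirq`).

# Qiskit version
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)
qc.h(0)                      # Hadamard on qubit 0
qc.cx(0, 1)                  # CNOT entangling qubits 0 and 1
qc.measure([0, 1], [0, 1])   # measure both qubits into classical bits
print(qc)

# Cirq version of the same circuit
import cirq

q0, q1 = cirq.LineQubit.range(2)
circuit = cirq.Circuit([cirq.H(q0),
                        cirq.CNOT(q0, q1),
                        cirq.measure(q0, q1, key='m')])
print(circuit)
```

Once the linear algebra underneath is second nature, moving between frameworks like these is mostly a matter of syntax, which is exactly why I argue for theory-first learning above.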
Note: This blog post is about ideas and musings I have regarding quantum optimization (my 2 cents), with the aim that you may be able to use some of them in your own research. This is neither an introductory nor a very advanced technical post. Some of it may be common knowledge to you (if it is, do let me know), in which case I hope these ideas will at least help new researchers just entering the field.
One of the areas where quantum computing is expected to make an impact is the field of discrete optimization. The two most promising approaches for this task are Quantum Annealing (QA), which requires a machine like the D-wave, and the Quantum Approximate Optimization Algorithm (QAOA) for gate-based quantum computers. The process of solving your problem by either of these approaches is essentially to: (1) convert your problem into an Ising model problem, (2) embed it onto the hardware of the computer, (3) run QA or QAOA multiple times (they are probabilistic), and (4) get the results and maybe post-process them a bit to optimize further.

Since the advent of these approaches, people have been mapping the use cases (problems) of their interest onto the Ising model with the hope that quantum computing may provide some sort of advantage when it comes to solving those problems. However, while comparing the results with classical methods/heuristic solvers, I believe that researchers are often overlooking one important detail: the classical methods are using properties inherent to their specific problems. The quantum approaches (most often) do not. In other words, QA and QAOA are being used as a one-size-fits-all approach. In their 2014 paper, Farhi et al. compared QAOA on the MAXCUT problem on 3-regular graphs with the best classical approximation algorithm, but QAOA was not modified in any way to adapt to that specific problem.

The field of classical computation is backed by years of research in approximation and parameterized algorithms. Practical quantum computing is a field that became possible (and popular) only this decade (the D-wave One was released in 2011, and the paper on QAOA was published in 2014). In the hype of quantum computing maybe being able to solve NP-Hard problems (unlikely) or at least beat classical methods, I think we got a bit too greedy and rushed forward to try it for every suitable problem under the sun (and we are still in the process). So it is fair to say that we have much to learn about how to efficiently harness this new technology. I guess what I am trying to say is: maybe we can be better in the way we use QA and QAOA for solving our domain-specific problems. The two steps we can take, at the very least, are (1) tinker with the input and (2) tinker with the process.

Tinker with the Input: This may be obvious to a lot of people (but it still needs to be said as a reminder): not all Ising formulations are created equal (with regards to a common problem). This became apparent when I saw various works on the D-wave (and QAOA) about Integer Factorization (1, 2, 3). If it were possible to evaluate the D-wave for Integer Factorization with just one work, we wouldn't need so many papers that focus on various Ising formulations. The point I am trying to emphasize is: it is important to focus on how best to translate your domain problem into an Ising formulation (a toy sketch of such a translation follows below).
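To make the "tinker with the input" step concrete, here is a minimal sketch of the translation step for MAXCUT (my own toy example in plain Python, not taken from the factorization papers cited above). Maximizing the cut becomes a ground-state search on an Ising model with one +1 coupling per edge and no linear terms; for tiny graphs we can brute-force the ground state that QA or QAOA would otherwise search for.

```python
# Toy sketch of step (1): translating MAXCUT on a small graph into an
# Ising problem and checking the ground state by brute force.
from itertools import product

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]  # small example graph
n = 4

# MAXCUT maps to the Ising energy H(s) = sum_{(i,j) in E} J_ij * s_i * s_j
# with J_ij = +1 and no h terms: a cut edge contributes -1, an uncut
# edge +1, so minimizing H maximizes the number of cut edges.
J = {(i, j): 1.0 for i, j in edges}

def ising_energy(s):
    return sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())

# Brute force over all spin configurations (only feasible for tiny n;
# the annealer or QAOA would take over here at real problem sizes).
best = min(product([-1, 1], repeat=n), key=ising_energy)
cut_size = sum(best[i] != best[j] for i, j in edges)
print("ground state:", best,
      "energy:", ising_energy(best),
      "cut size:", cut_size)
```

Different formulations of the same domain problem yield different couplings, different numbers of spins, and different embedding overheads, which is precisely why the formulation step deserves attention in its own right.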
Tinker with the Process: This is harder to do, because of the way QA and QAOA are set up. However, we can still do a few things. Classical post-processing is often used on the output of quantum optimizers to get some 'extra mileage' out of the results. Maybe we can focus on creating classical post-processing specific to the domain of the problem. For example, there was an effort to try and use the D-wave quantum annealer for linear systems of equations and least squares (4, 5, 6, 7, 8) (not NP-Hard, but I believe this point still stands!). Now, since the energy landscape of those problems is convex in nature, maybe we can use the best result from the quantum annealer as a starting point towards the optimal solution, and the other results (since it is a set of results) to figure out some sort of gradient that will be useful for convergence. This particular idea may or may not work, but the general idea is to leverage the knowledge we have about our problem.

There are other approach-specific things we can do:

For Quantum Annealing: This cannot be emphasized enough: after you have ascertained the baseline performance for your problem on the quantum annealer (say, the D-wave), it is important to play with parameters like gauge transformations, various types of chain resolutions, beta parameters, etc., to see what kind of effect they have on the answers. Better yet, having a theory behind why certain parameter values might be better for a problem (with empirical proof) will bring more credibility.

For QAOA: Because QAOA is technically a hybrid quantum-classical algorithm which requires angles (found separately) in order to approximate the solutions, we can pay special attention to the process that we use to find these angles (like Bayesian Optimization, grid search, the Variational Quantum Eigensolver, etc.). For certain problems grid search may be good enough, but for others we may need Bayesian Optimization or even VQE. A toy version of the grid-search option is sketched below.
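As an illustration of the grid-search option mentioned above, here is a small self-contained sketch (my own toy example, written with plain numpy rather than any vendor SDK) that scans a grid of (gamma, beta) angles for depth-1 QAOA on MAXCUT over a triangle graph:

```python
# Toy p=1 QAOA angle grid search for MAXCUT on a triangle graph,
# simulated with plain numpy (not tied to any vendor SDK).
import numpy as np
from itertools import product

edges = [(0, 1), (1, 2), (0, 2)]   # triangle graph
n = 3

# Cut value of each computational basis state (qubit 0 = most significant bit)
def cut_value(bits):
    return sum(bits[i] != bits[j] for i, j in edges)

costs = np.array([cut_value(b) for b in product([0, 1], repeat=n)], dtype=float)

def qaoa_expectation(gamma, beta):
    # Start in the uniform superposition |+>^n
    state = np.full(2**n, 1 / np.sqrt(2**n), dtype=complex)
    # Phase separator e^{-i*gamma*C}: diagonal in the computational basis
    state *= np.exp(-1j * gamma * costs)
    # Mixer e^{-i*beta*X} on every qubit, built as a Kronecker product
    rx = np.array([[np.cos(beta), -1j * np.sin(beta)],
                   [-1j * np.sin(beta), np.cos(beta)]])
    mixer = rx
    for _ in range(n - 1):
        mixer = np.kron(mixer, rx)
    state = mixer @ state
    # Expected cut value under the resulting measurement distribution
    return float(np.sum(np.abs(state)**2 * costs))

# Plain grid search over the two angles
grid = np.linspace(0, np.pi, 50)
best = max((qaoa_expectation(g, b), g, b) for g in grid for b in grid)
print(f"best <C> = {best[0]:.3f} at gamma = {best[1]:.3f}, beta = {best[2]:.3f}")
```

For larger depths or problem sizes the grid (and the cost of each evaluation) grows quickly, which is exactly where the smarter angle-finding strategies mentioned above come in.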
These are some of the ideas I could think of at this time. Of course, they all revolve around the QA and QAOA methods (also Coherent Ising Machines, if you consider them 'quantum'). Let me know your thoughts regarding this blogpost. Since this is my first (serious) blog post, I'll be coming back and updating it to make it better.

Ajinkya Borle
Grad student at UMBC.