IBM just completed its annual Quantum Summit. The company has accomplished several quantum roadmap milestones and issued a new challenge. During these summits, IBM updates the industry on its ongoing efforts to make quantum computing a key part of the future of computing and sets goals for future developments. The theme of the summit was “The Next Wave,” as IBM believes quantum is rapidly approaching an inflection point. This year’s summit was held in downtown Manhattan and attended by many IBM business partners, including Boeing, Bosch, and Vodafone. It was also well attended by academics and government researchers.
The key to quantum computing’s acceptance and maturation as an industry will be building a healthy ecosystem and set of partnerships. IBM’s Quantum Network now numbers over 200 members. The company added new quantum innovation centers at Arizona State University, DESY, IIT Madras, and the newly opened uptownBasel innovation center in Switzerland. Its latest industry partners include Bosch, Crédit Mutuel Alliance Fédérale, Erste Digital, Tokyo Electron, HSBC, and Vodafone. IBM offers multiple plans for its quantum program: an open plan, a pay-as-you-go plan, and a premium plan. The company also offers a Quantum Accelerator program with support resources for businesses at any point on their journey to quantum readiness.
One of the new partners added to the IBM Quantum Network is multinational telecommunications company Vodafone, which is exploring quantum computing and quantum-safe cryptography as a lead partner under the 3GPP consortium. Another partner is the French bank Crédit Mutuel, which is exploring use cases in financial services. And uptownBasel is offering skill development and promoting leading innovation projects in quantum and high-performance computing technology.
IBM made 12 major announcements at the summit, too many to condense into one story. One key hardware announcement was Osprey, IBM’s new 433-quantum-bit (qubit) quantum processing unit (QPU) and the world’s largest superconducting quantum processor.
Overall, IBM is making continued improvements to its quantum computers along three axes: scale (more qubits), quality (increased quantum volume, which factors in coherence time and error rates), and speed (CLOPS, or circuit layer operations per second). The qubit count more than triples with the introduction of the 433-qubit Osprey QPU. There is a 4x improvement in quantum volume, from 128 to 512, and a 10x improvement in CLOPS, from 1.4k to 15k, surpassing IBM’s goal of 10k CLOPS by 50 percent; both of those gains were demonstrated on its Falcon processors.
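For context on those quantum volume numbers: QV is reported as a power of two, 2^n, where n is the size of the largest n-qubit, depth-n “square” circuit the machine can run reliably. The jump from 128 to 512 therefore corresponds to going from 7-qubit to 9-qubit square circuits, as this quick check shows:

```python
import math

# Quantum Volume (QV) is reported as 2**n, where n is the side length of
# the largest n-qubit, depth-n "square" circuit the machine runs reliably.
def qv_to_square_circuit_size(qv: int) -> int:
    """Return the square-circuit size n implied by a Quantum Volume value."""
    n = int(math.log2(qv))
    assert 2 ** n == qv, "Quantum Volume should be an exact power of two"
    return n

print(qv_to_square_circuit_size(128))  # 7
print(qv_to_square_circuit_size(512))  # 9
```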
IBM also released full access to dynamic circuits, quantum circuits that incorporate classical computation during execution to support a richer array of circuit operations. Among other advantages, dynamic circuits can greatly reduce the depth of certain quantum circuits, which makes useful quantum circuit design more practical.
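The feedforward idea behind dynamic circuits can be sketched with a library-free toy simulator (an illustration of the concept, not IBM’s actual software stack): a measurement in the middle of the circuit produces a classical bit, and that bit decides whether the next gate is applied. The canonical example is an “active reset” of a single qubit:

```python
import random

# Toy single-qubit statevector: [amp0, amp1] for |0> and |1>.
# Illustrates the dynamic-circuit idea: a mid-circuit measurement whose
# classical outcome decides the next gate. This is a pedagogical sketch,
# not IBM's hardware or Qiskit API.

def h(state):
    """Hadamard gate: puts |0> into an equal superposition."""
    a, b = state
    s = 2 ** -0.5
    return [s * (a + b), s * (a - b)]

def x(state):
    """Pauli-X (bit flip)."""
    a, b = state
    return [b, a]

def measure(state, rng):
    """Collapse the state; return (classical outcome, post-measurement state)."""
    p1 = abs(state[1]) ** 2
    if rng.random() < p1:
        return 1, [0.0, 1.0]
    return 0, [1.0, 0.0]

rng = random.Random(7)
state = h([1.0, 0.0])             # equal superposition
bit, state = measure(state, rng)  # mid-circuit measurement
if bit == 1:                      # classical feedforward within the circuit
    state = x(state)              # conditional X returns the qubit to |0>
print(state)  # always [1.0, 0.0]: the reset succeeds for either outcome
```

Without the conditional gate, half the shots would end in the wrong state; the in-circuit classical branch is what makes the reset deterministic.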
To address the scaling issues of multi-QPU systems, IBM needed a new cryostat design. The company had previously announced the IBM Quantum System Two design, but as it gets closer to implementation, that artist’s conception has changed, and a new design was revealed at the summit. With IBM Quantum System Two, multiple cryostats can be positioned next to each other and joined by communication links to form a single system. IBM plans to have System Two online by the end of 2023. With the next-generation system, IBM plans to build the next wave of quantum computing and will integrate quantum middleware with quantum and classical workflows in a multicloud environment.
With the development of quantum computing technology, there is also the threat that it can be used as a weapon. Specifically, quantum computing has a good chance of advancing to the point where it can crack public-key encryption protocols using Shor’s algorithm. This future threat is also being addressed by the US National Institute of Standards and Technology (NIST) in its next-generation, post-quantum encryption standards. IBM reminded the audience that it has already deployed support for those next-generation standards in its latest IBM z16 mainframe computer.
IBM Fellow and VP of Quantum Jay Gambetta ended the list of announcements with a challenge he called “100×100.” As he put it in his blog post: “In 2024, we plan to offer a tool capable of calculating unbiased observables of circuits with 100 qubits and 100 depth of gate operations in a reasonable runtime. We’re confident in our ability to deliver this tool thanks to Heron: If we can build a Heron processor with error rates below the ‘three-nines’ gate fidelity threshold, plus the software infrastructure to read out the circuits in concert with classical resources, then we can run a circuit of 100×100 in less than a day and produce unbiased results. This system will be able to run quantum circuits with complexity and runtime beyond the capabilities of the best classical computers today.” At that threshold, IBM believes it will be able to demonstrate that quantum computers can solve problems that are impractical on classical computers, often called Quantum Advantage. Having QPUs with more than 100 nearly error-free qubits allows the implementation of deeper circuits and therefore more complex operations.
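A back-of-envelope calculation (our own rough estimate, not IBM’s published math) suggests why the challenge targets unbiased observables through error mitigation rather than error-free execution: a 100-qubit, depth-100 circuit contains on the order of 100 × 100 = 10,000 gate operations, so even at “three-nines” (99.9 percent) gate fidelity, almost every individual shot contains at least one error.

```python
import math

# Rough error budget for a 100x100 circuit (an assumption for illustration:
# treat it as ~10,000 independent gate operations, each 99.9% reliable).
gates = 100 * 100
fidelity = 0.999

# Probability that a single shot runs with no gate error at all.
p_error_free = fidelity ** gates
print(f"{p_error_free:.2e}")  # roughly 4.5e-05, i.e. ~1 shot in 22,000
```

That is why the tool is framed as *calculating* unbiased observables: classical post-processing must statistically cancel the errors that raw shots inevitably contain.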
Even though the concept of a quantum computer has been in development for decades, only recently have we reached the point where we have enough qubits to make things interesting. A good part of it is still a science project. Tools are becoming more sophisticated and increasingly approachable. But it’s still an area looking for problems to solve. In addition, the pioneers of quantum computing still talk in terms that are not readily understood by mainstream programmers. We are roughly in the position that neural networks and deep learning were in about ten years ago. We may be seeing a similar inflection point for quantum computing as it increases its capabilities. The real difference is that you can run AI processing on a vast range of computing devices, from microcontrollers to supercomputers. Superconducting quantum computing needs specialized equipment that can only be supported at data center scale. There are other types of quantum computing devices that may find applications in some areas, but for the highest speed and capability, IBM has made its bet on superconducting qubits.
IBM has been committed to quantum computing because, as Dario Gil, the company’s head of research, told the audience at the beginning of the event, there are three general areas of computing: bits (classical), neurons (AI), and qubits (quantum). Both AI and quantum will still require control and connections to classical computing and will not replace regular bits anytime soon. But both AI and quantum can provide specialized functions that are difficult or even impractical with classical computing. While AI has already established itself in the mainstream market, quantum is still a very nascent technology. There are some analogies between AI and quantum. For example, both provide probabilistic (stochastic) results: AI provides a probabilistically correct answer, and quantum provides a probability distribution. The journey for AI to win mainstream acceptance did not happen overnight, but once it was recognized as a powerful tool, applications sprang up everywhere. Quantum is still on that path to acceptance and recognition of its power. Quantum computing is still seeking its niche. The application of quantum to real-world problems may be more limited today, but TIRIAS Research believes that quantum computing will help solve some of the hardest problems that classical computing can only approximate, and that AI can only guess at.
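The “probability distribution” point can be made concrete: a quantum program’s answer is a histogram built up from many repeated runs (“shots”) of the same circuit. The toy sketch below samples an ideal, noise-free two-qubit Bell-state distribution (50 percent ‘00’, 50 percent ‘11’, a standard textbook example rather than real quantum execution) to show that the distribution, not any single shot, is the result:

```python
from collections import Counter
import random

# A quantum computer's output is a probability distribution over bitstrings,
# estimated by running the same circuit many times ("shots"). Here we sample
# the ideal Bell-state distribution classically to illustrate the idea.
rng = random.Random(0)
shots = 10_000
counts = Counter(rng.choice(["00", "11"]) for _ in range(shots))
estimate = {bits: n / shots for bits, n in sorted(counts.items())}
print(estimate)  # close to {'00': 0.5, '11': 0.5}
```

A single shot returns only ‘00’ or ‘11’; it is the aggregate statistics over thousands of shots that carry the computation’s answer, which is one reason quantum results take some getting used to for mainstream programmers.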
Tirias Research tracks and consults for companies throughout the electronics ecosystem from semiconductors to systems and sensors to the cloud. Members of the Tirias Research team have consulted for IBM and other companies throughout the Security, AI and Quantum ecosystems.