Busan City Hall Book Summary
   Global Trends


  • Preparing for the Quantum Computing Revolution

    Despite the relentless advance of Moore’s law and complementary advances in software over the last half-century, there are still many problems that today’s computers can’t solve. Some problems simply await the next generation of semiconductors in order to become commercially important. But others will likely remain beyond the reach of classical computers forever. It is the prospect of finally solving these “classically intractable” problems which motivates potential providers and users at the dawn of the quantum computing era.

    Their enthusiasm is not misplaced. But, despite its enormous potential, assessing the timing and real-world impact of quantum computing has proven more difficult than for any of the other transformational technologies highlighted in our book, Ride the Wave. Fortunately, the fog is beginning to clear as breakthroughs emerge from research labs around the world.

    In the coming decades, Boston Consulting Group (or BCG) expects end-users of quantum computing to realize gains in the form of both cost savings and revenue opportunities of up to $850 billion annually. These gains will accrue first to firms in industries with complex simulation and optimization requirements. The way forward is likely to involve “a slow build” over the next few years, reaching a relatively modest $2 billion to $5 billion a year by 2024. But then value creation will accelerate rapidly as the technology and its commercial applications mature.
    The best estimates agree that the first payoffs will begin four or five years in the future and that “the big quantum jackpot” probably lies more than ten years out. That raises the question, “If quantum computing’s transformative value is at least five to ten years away, why are enterprises considering investments now?”

    The simple answer is that this is a radical technology which presents formidable ramp-up challenges, even for companies which already possess advanced supercomputing capabilities. Both quantum programming and the underlying “quantum tech stack” bear little resemblance to their classical counterparts, although the two technologies are likely to work quite closely together. Therefore, early adopters stand to gain expertise, visibility into technological gaps and key intellectual property that will put them at a structural advantage when quantum computing gains commercial traction.

    More importantly, experts believe that progress toward maturity in quantum computing will not follow a smooth continuous curve. Instead, quantum computing is likely to experience breakthroughs sporadically. Companies that have invested to integrate quantum computing into the workflow are far more likely to be in a position to capitalize on their experience and the leads they open up will be difficult for others to close.

    This will confer a substantial advantage in industries in which classically intractable computational problems lead to bottlenecks and missed revenue opportunities.

    The assessment of future business value begins with the question of what kinds of problems quantum computers can solve more efficiently than binary machines. There is no simple answer, but two indicators are the size and complexity of the calculations that need to be done.

    Consider drug discovery, for example. For scientists trying to design a compound that will attach itself to, and modify, a target disease pathway, the critical first step is to determine the electronic structure of the molecule. But modeling the structure of a molecule of an everyday drug such as penicillin, which has 41 atoms at ground state, would require a classical computer with some 10⁸⁶ bits; that is more transistors than there are atoms in the observable universe. Therefore, such a machine is a physical impossibility. But for quantum computers, this type of simulation is well within the realm of possibility, requiring a processor with just 286 quantum bits, or qubits. The best opportunities for maximizing the impact of quantum computers seem to lie in:

    - Materials Design,
    - Drug Discovery,
    - Financial Services,
    - Computational Fluid Dynamics,
    - Transportation and Logistics,
    - Energy, and
    - Meteorology.
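
    The penicillin numbers above can be sanity-checked: simulating n qubits on a classical machine requires storing on the order of 2^n amplitudes, so 286 qubits correspond to roughly 10⁸⁶ classical values. A quick check in Python (a minimal sketch of the arithmetic, not a chemistry simulation):

```python
import math

# A classical simulation of an n-qubit state must track ~2^n complex
# amplitudes. For the 286-qubit penicillin simulation cited above,
# that is ~10^86 values - more than the number of atoms in the
# observable universe.
n_qubits = 286
digits = n_qubits * math.log10(2)   # log10(2^286)
print(round(digits))  # 86
```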

    Given this trend, we offer the following forecast for your consideration.

    First, the full potential of quantum computing will be realized in three phases over roughly three decades.

    The period through 2025 will be characterized by so-called Noisy Intermediate-Scale Quantum devices (or NISQs), which will become increasingly capable of performing useful, discrete functions, albeit with relatively high error rates. These will most likely be used to exploit quantum heuristics, somewhat analogous to conventional neural networks.

    Companies including IBM, Google, and Rigetti are anticipating technological breakthroughs in error-mitigation techniques which will enable NISQ devices to produce the first quantum-advantaged experimental discoveries in simulation and combinatorial optimization.

    Next, sometime between 2025 and 2035, quantum computers are expected to achieve so-called Broad Quantum Advantage, which will yield superior performance in a wide range of tasks with real industrial significance.

    This phase will deliver a genuine quantum leap over the speed, cost, and quality possible with conventional binary machines. This era will require overcoming significant technical hurdles in “error correction” and other areas, as well as continuing increases in the power and reliability of quantum processors. In preparation for this phase, companies such as Zapata Computing are betting that quantum-advantaged molecular simulation will drive not only significant cost savings but the development of better products that reach the market sooner.

    Ultimately, a third phase called Full-Scale Fault Tolerance is expected to arrive during the 2040s. Achieving full-scale fault tolerance will require makers of quantum technology to overcome additional technical constraints, including problems related to scale and stability. But once they arrive, fault-tolerant quantum computers will transform a broad array of industries. They have the potential to vastly reduce trial-and-error and improve automation in the specialty-chemicals market, enable tail-event defensive trading and risk-driven high-frequency trading strategies in finance, and turbo-charge the productivity of “in silico drug discovery,” which has major implications for personalized medicine.

    Second, across multiple industries, quantum computing will increase incremental operating income by up to $850 billion a year by 2050.

    Assuming a P/E ratio of 15, quantum computing could contribute nearly $13 trillion to stock market valuations by 2050. According to the Boston Consulting Group, this payoff will be almost evenly split between incremental annual revenue streams and recurring cost efficiencies.
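
    The valuation arithmetic is easy to verify: capitalizing $850 billion of incremental annual operating income at a P/E of 15 gives $12.75 trillion, i.e., “nearly $13 trillion”:

```python
# Capitalizing the projected incremental operating income at a P/E of 15.
incremental_income = 850e9   # $850 billion a year by 2050
pe_ratio = 15
valuation = incremental_income * pe_ratio
print(valuation / 1e12)  # 12.75 (trillion dollars)
```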

    Third, the biggest winners will be those who identify opportunities as early as possible and prepare to exploit them as soon as the technology arrives.

    That raises the question, “What can companies do today to get ready?” According to BCG, a good first step is performing a diagnostic assessment to determine the potential impact of quantum computing on your industry and then, if appropriate, developing a partnership strategy for capturing the value that can be created. The first part of the diagnostic involves a self-assessment of your company’s opportunities and challenges and its use of computing resources, ideally involving people from R&D and other functions. The key questions to ask include at least these four:

    1. Are we currently spending a lot of money or other resources to tackle problems using high-performance computers? If so, are these efforts yielding low impact, delayed, or piecemeal results that appear to “leave value on the table?”

    2. Does the presumed difficulty of solving simulation or optimization problems prevent us from trying high-performance computing or other computational solutions?

    3. Are we spending resources on inefficient trial-and-error alternatives, such as wet-lab experiments or physical prototyping? And,

    4. Are any of the major problems we need to solve rooted in quantum-advantaged problem archetypes including combinatorial optimization, differential equations, linear algebra, or factorization?

    If the answer to any of these questions is yes, the next step is to perform a “quantum value diagnostic.” This starts by assessing where quantum computing could have an early or outsized impact on discrete “pain points” in particular industries. Behind each pain point, you’ll find a bottleneck for which there may be multiple solutions or a latent pool of income that can be tapped in many ways.

    Therefore, mapping opportunities must include solutions rooted in other technologies - such as machine learning - that may arrive on the scene sooner, at lower cost, or may be integrated more easily into existing workflows. Establishing a valuation over time for quantum computing in a given industry or for a given firm will require gathering and synthesizing expertise from a number of sources. These sources should include:

    - Industry business leaders who can attest to the business value of addressing a given pain point;

    - Industry technical experts who can assess the limits of current and future nonquantum solutions to each pain point; and

    - Quantum computing experts who can confirm that quantum computers will be able to solve the problem and when.

    Using this methodology, BCG has sized up the impact of quantum advantage for a number of sectors, with an emphasis on early opportunities.

    Consider the results.

    Materials Design and Drug Discovery
    On the face of things, no two fields of R&D more naturally lend themselves to quantum advantage than materials design and drug discovery. Even if some experts dispute whether quantum computers will have an advantage in modeling the properties of quantum systems, there is no question that the shortcomings of classical computers limit R&D in these areas. Materials design, in particular, is a slow lab-based process characterized by trial and error.

    According to R&D Magazine, for specialty materials alone, global firms spend upwards of $40 billion a year on candidate material selection, material synthesis, and performance testing. Improvements to this workflow will yield not only cost savings through efficiencies in design and reduced time to market but revenue uplift through net new materials and enhancements to existing materials. The benefits of design improvements yielding optimal synthetic routes would, in all likelihood, flow downstream, affecting the estimated $460 billion spent annually on industrial synthesis.

    The biggest benefit quantum computing offers is the potential for simulation, which for many materials requires computing power that binary machines do not possess. Reducing trial-and-error lab processes and accelerating the discovery of new materials is only possible if materials scientists can derive higher-level spectral, thermodynamic, and other properties from ground-state energy levels described by the Schrödinger equation.

    The problem is that none of today’s approximate solutions - from Hartree-Fock to density functional theory - can account for the quantized nature of the electromagnetic field. Current computational approximations only apply to a subset of materials for which interactions between electrons can effectively be ignored or easily approximated, and there remains a well-defined set of problems which need simulation-based solutions; outsized rewards can be expected to accrue to the companies that manage to solve them first. These problems include simulations of:

    - strongly correlated electron systems (for high-temperature superconductors),
    - manganites with colossal magnetoresistance (needed for high-efficiency data storage and transfer),
    - multiferroics (for high-absorbency solar panels), and
    - high-density electrochemical systems (for lithium-air batteries).

    All of the major players in quantum computing, including IBM, Google, and Microsoft, have recently established partnerships or offerings in materials science and chemistry. Google’s partnership with Volkswagen, for example, is aimed at simulations related to high-performance batteries as well as other materials. Microsoft released a new chemical simulation library developed in collaboration with Pacific Northwest National Laboratory. And IBM, having run the largest-ever molecular simulation on a quantum computer in 2017, released an end-to-end stack for quantum chemistry in 2018.

    Potential end-users of the technology are embracing these efforts. One researcher at a leading global materials manufacturer believes that quantum computing “will be able to make a quality improvement on classical simulations in less than five years,” during which period value to end-users approaching some $500 million is expected to come in the form of design efficiencies (measured in terms of reduced expenditures across the R&D workflow). 

    As error correction enables functional simulations of more complex materials, “you’ll start to unlock new materials and it won’t just be about efficiency anymore,” a professor of chemistry reported. During the period of broad quantum advantage, BCG estimates that upwards of $5 billion to $15 billion a year in value (which they measure in terms of increased R&D productivity) will accrue to end-users, principally through the development of new and enhanced materials.

    Once full-scale fault-tolerant quantum computers become available, the value could reach the range of $30 billion to $60 billion a year, principally through new materials and extensions of in-market patent life as time-to-market is reduced. As the head of business development at a major materials manufacturer stated, “If unknown chemical relationships are unlocked, the current specialty chemical market [currently $51 billion in operating income annually] could double.”

    Quantum advantage in drug discovery will be later to arrive given the maturity of existing simulation methods for “established” small molecules. Nonetheless, in the long run, as quantum computers unlock simulation capabilities for molecules of increasing size and complexity, experts believe that drug discovery will be among the most valuable of all industrial applications.

    In terms of cost savings, the drug discovery workflow is expected to become more efficient, with in silico modeling increasingly replacing expensive in vitro and in vivo screening. But there is good reason to believe that there will be major top-line implications as well. Experts expect more powerful simulations not only to promote the discovery of new drugs but also to generate replacement value over today’s generics as larger molecules produce drugs with fewer side-effects. 

    Between reducing the $35 billion in annual R&D spending on drug discovery and boosting the $920 billion in yearly branded pharmaceutical revenues, quantum computing is expected to yield $35 billion to $75 billion in annual operating income once companies have access to fault-tolerant machines.

    Financial Services
    In recent history, few if any industries have been faster to adopt vanguard technologies than financial services. There is good reason to believe that the industry will quickly ramp up investments in quantum computing, which can be expected to address a clearly defined set of simulation and optimization problems - in particular, portfolio optimization in the short term and risk analytics in the long term. Investment money has already started to flow to startups, with Goldman Sachs and Fidelity investing in full-stack companies such as D-Wave, while RBS and Citigroup have invested in software players such as 1QBit and QC Ware.

    Discussions with quantitative investors about the pain points in portfolio optimization, arbitrage strategy, and trading costs make it easy to understand why. While investors use classical computers for all these problems today, the capabilities of these machines are limited - not so much by the number of assets or the number of constraints introduced into the model as by the types of constraints.

    For example, adding noncontinuous, nonconvex functions such as interest rate yield curves, trading lots, buy-in thresholds, and transaction costs to investment models makes the optimization “surface” so complex that classical optimizers often crash, simply take too long to compute, or, worse yet, mistake a local optimum for the global optimum. To get around this problem, analysts often simplify or exclude such constraints, sacrificing the fidelity of the calculation for reliability and speed. Such trade-offs, many experts believe, would be unnecessary with quantum combinatorial optimization.
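
    The local-optimum trap described above is easy to reproduce even at toy scale. The sketch below (all numbers hypothetical) shows a lot-constrained portfolio where a greedy return-per-dollar heuristic stalls at a worse answer than exhaustive search finds; at realistic asset counts the exhaustive search becomes infeasible, which is the gap quantum combinatorial optimization is expected to close:

```python
from itertools import product

# Hypothetical toy portfolio: integer trading lots make the problem
# combinatorial. A greedy return-per-unit-cost heuristic picks 3 lots
# of asset 0 (value 0.24), but exhaustive search finds the global
# optimum (2, 2, 0) with value 0.26.
returns = [0.08, 0.05, 0.12]   # expected return per lot of each asset
cost    = [3, 2, 5]            # capital consumed per lot
budget  = 10                   # total capital available

def value(lots):
    spend = sum(l * c for l, c in zip(lots, cost))
    if spend > budget:
        return float("-inf")   # infeasible allocation
    return sum(l * r for l, r in zip(lots, returns))

# Exhaustive search over all feasible lot combinations (0-3 lots each).
best = max(product(range(4), repeat=3), key=value)
print(best, round(value(best), 2))  # (2, 2, 0) 0.26
```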

    Exploiting the probability amplitudes of quantum states is expected to dramatically accelerate portfolio optimization, enabling a full complement of realistic constraints and reducing portfolio turnover and transaction costs; according to the head of portfolio risk at one major U.S. bank, this equals as much as 2% to 3% of assets under management.

    BCG calculates that income gains from quantum portfolio optimization should reach $200 million to $500 million in the next three to five years and accelerate swiftly with the advent of enhanced error correction during the period of broad quantum advantage. The resulting improvements in risk analytics and forecasting should drive value creation beyond $5 billion a year.

    As the brute-force Monte Carlo simulations used for risk assessment today give way to more powerful “quantum walk algorithms,” faster simulations will give banks more time to react to negative market risk (resulting in returns of as much as 12 basis points). The expected benefits include better intraday risk analytics for banks and near-real-time risk assessment for quantitative hedge funds.

    One former quantitative analyst at a leading U.S. hedge fund complained, “Brute-force Monte Carlo simulations for economic spikes and disasters take a whole month to run.” Bankers and hedge fund managers hope that, with the kind of “whole-market simulations” theoretically possible on full-scale fault-tolerant quantum computers, they will be able to better predict black-swan events and even develop risk-driven high-frequency trading.
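
    The month-long runs reflect a basic statistical fact: Monte Carlo estimation error falls only as 1/√N in the number of simulated scenarios, while quantum amplitude-estimation methods promise roughly 1/N convergence. A minimal classical sketch, using a hypothetical one-factor loss model rather than any real risk engine:

```python
import random

# Toy one-factor Monte Carlo estimate of 99% Value-at-Risk (hypothetical
# model: standard-normal daily losses). The error shrinks only as
# 1/sqrt(N), which is why realistic risk runs are so slow; quantum
# amplitude estimation promises ~1/N convergence instead.
random.seed(0)

N = 100_000
losses = sorted(random.gauss(0.0, 1.0) for _ in range(N))
var_99 = losses[int(0.99 * N)]   # empirical 99th-percentile loss
print(round(var_99, 2))          # close to the true normal quantile 2.33
```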

    “Moving risk management from positioning defensively to an offensive trading strategy is a whole new paradigm,” noted a former trader at a U.S. hedge fund. Coupled with enhanced model accuracy and positioning against extreme tail events, reductions in capital reserves (by as much as 15% in some estimates) will position quantum computing to deliver $40 billion to $70 billion in annual operating income to banks and other financial services companies as the technology matures.

    Computational Fluid Dynamics
    Computational fluid dynamics (or CFD), which involves simulating the precise flow of liquids and gases under “changing conditions” on a computer, is a costly but critical process for many companies in a range of industries. Spending on simulation software by companies using CFD to design airplanes, spacecraft, cars, medical devices, and wind turbines exceeded $4 billion in 2017, but the costs that weigh most heavily on decision-makers in these industries are those related to expensive trial-and-error testing such as wind tunnel and wing flex tests. 

    These direct costs, together with the revenue potential of energy-optimized design, have many experts excited by the prospect of introducing quantum simulation into the workflow. The governing equations behind CFD, known as the Navier-Stokes equations, are nonlinear partial differential equations and thus a natural fit for quantum computing.
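
    For reference, the incompressible form of the Navier-Stokes equations mentioned above can be written as:

```latex
% Incompressible Navier-Stokes: momentum and continuity equations.
% u = velocity field, p = pressure, \rho = density, \nu = kinematic
% viscosity, f = body forces. The (u . \nabla)u term is the
% nonlinearity that makes these equations hard for classical solvers.
\frac{\partial \mathbf{u}}{\partial t}
  + (\mathbf{u} \cdot \nabla)\,\mathbf{u}
  = -\frac{1}{\rho}\,\nabla p + \nu\,\nabla^{2}\mathbf{u} + \mathbf{f},
\qquad
\nabla \cdot \mathbf{u} = 0
```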

    The first bottleneck in the CFD workflow is an optimization problem in the preprocessing stage that precedes any fluid dynamics algorithm. Because of the computational complexity involved in these algorithms, designers create a mesh to simulate the surface of an airplane wing or other object. The mesh is composed of geometric primitives whose vertices form a constellation of nodes. Most classical optimizers limit the number of nodes in a mesh that can be simulated efficiently to about one billion.

    This forces the designer into a tradeoff between how fine-grained and how large a simulated surface can be. Quantum optimization is expected to relieve the designer of that constraint so that bigger pieces of the puzzle can be solved, more accurately. Improving this preprocessing stage of the design process is expected to lead to operating-income gains of between $1 billion and $2 billion across industries through reduced costs and faster revenue realization.

    As quantum computers mature, the benefits of improved mesh optimization are expected to be surpassed by those from accelerated and improved simulations. As with mesh optimization, the tradeoff in fluid simulations is between speed and accuracy. For large simulations with more than 100 million cells, today’s run times can be weeks long, even using very powerful supercomputers. 

    And that is with the use of simplifying heuristics, such as “approximate turbulence models.” During the period of broad quantum advantage, experts believe that quantum simulation could enable designers to reduce the number of heuristics required to run Navier-Stokes solvers in manageable time periods, resulting in the replacement of expensive physical testing with accurate moving-ground aerodynamic models, unsteady aerodynamics, and turbulent-flow simulations. 

    The benefits to end-users in terms of cost reductions are expected to start at $1 billion to $2 billion a year during this period. With full-scale fault tolerance, BCG says value creation could as much as triple; at that point, experts anticipate that quantum linear solvers will unlock predictive simulations that not only obviate physical testing requirements but also lead to product improvements (such as improved fuel economy) and manufacturing yield optimization. CFD value creation in the phase of full-scale fault tolerance is expected to range from $19 billion to $37 billion a year in added operating income.

    Other Industries
    During the NISQ era, more than 40% of the value created in quantum computing is expected to come from materials design, drug discovery, financial services, and applications related to CFD. But applications in other industries will show early promise as well. Examples include:

    Transportation and Logistics. Using quantum computers to address inveterate optimization challenges (such as the traveling salesman problem and the minimum spanning tree problem) is expected to lead to efficiencies in route optimization, fleet management, network scheduling, and supply chain optimization.
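
    The traveling salesman problem illustrates why these routing problems resist classical methods: exact search scales factorially with the number of stops. A brute-force sketch on a made-up four-city distance matrix:

```python
from itertools import permutations

# Brute-force traveling salesman on a hypothetical 4-city distance
# matrix. Exact search examines (n-1)! tours, which is why classical
# exact routing stalls at scale and fleet optimization is a target
# for quantum optimizers.
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]

def tour_length(order):
    path = (0,) + order + (0,)   # start and end at depot city 0
    return sum(dist[a][b] for a, b in zip(path, path[1:]))

best = min(permutations(range(1, 4)), key=tour_length)
print(best, tour_length(best))  # (1, 3, 2) 18
```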

    Energy. With the era of easy-to-find oil and gas coming to an end, companies are increasingly reliant on wave-based geophysical processing to locate new drilling sites. Quantum computing could not only accelerate the discovery process but also contribute to drilling optimizations for both greenfield and brownfield operations. And,

    Meteorology. Many experts believe that quantum simulation will improve large-scale weather and climate forecasting technologies, which would not only enable earlier storm and severe weather warnings but also bring speed and accuracy gains to industries that depend on weather-sensitive pricing and trading strategies.

    And if quantum computing becomes integrated into machine learning workflows, the list of affected industries would expand dramatically, with salient applications wherever

    - predictive capabilities (such as supervised learning and deep learning),
    - principal component analysis (for dimension reduction), and
    - clustering analysis (for anomaly detection)

    provide an advantage. While experts are divided on the timing of quantum computing’s impact on machine learning, the stakes are so high that many of the leading players are already putting significant resources into it today, with promising early results.

    For example, in conjunction with researchers from Oxford and MIT, a group from IBM recently proposed a set of methods for optimizing and accelerating support vector machines, which are applicable to a wide range of classification problems but have fallen out of favor in recent years because they quickly become inefficient as the number of predictor variables rises and the feature space expands. The eventual role of quantum computing in machine learning is still being defined, but early theoretical work, at least for optimizing current methods in linear algebra and support vector machines, shows promise.

    Resource List:
    1. Boston Consulting Group. May 13, 2019. Matt Langione, Corban Tillemann-Dick, Amit Kumar, and Vikas Taneja. Where Will Quantum Computers Create Value - and When?

    2. Boston Consulting Group. November 15, 2018. Philipp Gerbert and Frank Ruess. The Next Decade in Quantum Computing - and How to Play.

    3. Boston Consulting Group. October 6, 2020. Jean-François Bobier, Jean-Michel Binefa, Matt Langione, and Amit Kumar. It’s Time for Financial Institutions to Place Their Quantum Bets.

    4. Harvard Business Review. July/August 2021. Francesco Bova, Avi Goldfarb, and Roger Melko. Quantum Computing Is Coming. What Can It Do?

    5. EPJ Quantum Technology. January 29, 2021. Francesco Bova, Avi Goldfarb, and Roger G. Melko. Commercial Applications of Quantum Computing.

    6. TechRepublic.com. August 24, 2021. Karen Roby. Expert: Now Is the Time to Prepare for the Quantum Computing Revolution.

    7. TechRepublic.com. June 29, 2021. Esther Shein. The Quantum Decade: IBM Predicts the 2020s Will See Quantum Begin to Solve Real Problems.