
I. Starting with a Nobel Hotspot: Why Is Quantum Going "From the Microscopic to Engineering"?

The 2025 Nobel Prize in Physics was awarded to John Clarke, Michel H. Devoret, and John M. Martinis for their breakthroughs in quantum mechanics, which laid the foundations for quantum computing. That moment in the spotlight brought quantum science back into the limelight and naturally prompted renewed discussion of quantum computing. At the same time, in a major technical advance, Google announced its new quantum algorithm "Quantum Echoes", running on its Willow quantum chip, achieving a verifiable quantum advantage at roughly 13,000 times the speed of classical supercomputers. With both a theoretical high point and a practical milestone now in view, it is worth asking: quantum computing and quantum mechanics both carry "quantum" in the name, but how exactly are they related? Before diving into quantum computing, we first need to briefly explain what "quantum" means.
Quantum refers to the odd but powerful rules that govern tiny particles like atoms and electrons. Imagine electrons appearing as "probability clouds", able to exist in multiple places at once, unlike everyday objects. Quantum computing leverages those rules to tackle problems that classical computers struggle with.
II. A Brief History of Quantum Computing: From Thought Experiments to Multiple Parallel Paths

The development of quantum computing can be seen as a technological long march spanning over 40 years, progressing step by step from bold concepts to initial prototypes. The story dates back to 1981, when physicist Richard Feynman presciently proposed the idea of "simulating computation with quantum systems": because classical computers are extremely inefficient at simulating quantum phenomena, it would be better to let "nature solve problems by its own rules." In 1985, David Deutsch went further and proposed the theoretical framework of the quantum Turing machine, proving that quantum computers are feasible in principle and hold potential advantages for certain problems.
A truly industry-exciting milestone came in 1994: Peter Shor published a quantum algorithm that could efficiently factor large integers, showing that future quantum computers could outperform classical computation on this mathematical problem (which directly threatened encryption systems like RSA and drew global attention). Soon after, in 1996, Lov Grover's database-search algorithm once again demonstrated the appeal of quantum acceleration. These theoretical breakthroughs injected immense confidence and funding into quantum computing.
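To see why factoring is the crux, here is a minimal Python sketch of the classical half of Shor's algorithm: factoring N reduces to finding the period r of a^x mod N, after which simple gcd computations expose the factors. The brute-force period finder below stands in for the quantum subroutine that makes the real algorithm exponentially faster; the function names and the tiny example N = 15 are purely illustrative.

```python
from math import gcd

def find_period(a: int, n: int) -> int:
    """Brute-force the order r of a mod n (the step a quantum
    computer would perform exponentially faster in Shor's algorithm)."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_classical_part(n: int, a: int) -> tuple[int, int] | None:
    """Classical post-processing: turn a period into factors of n."""
    if gcd(a, n) != 1:            # lucky hit: a already shares a factor with n
        return gcd(a, n), n // gcd(a, n)
    r = find_period(a, n)
    if r % 2 == 1:                # need an even period; retry with another a
        return None
    x = pow(a, r // 2, n)
    if x == n - 1:                # trivial square root; retry with another a
        return None
    return gcd(x - 1, n), gcd(x + 1, n)

print(shor_classical_part(15, 7))   # (3, 5): the textbook example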
Subsequently, experimental progress began to catch up: in the late 1990s, scientists achieved simple operations with two qubits in the laboratory, proving that quantum computing was not just theoretical. In 2011, Canadian startup D-Wave announced a quantum annealing device with 128 qubits, touted as the world's first commercial quantum computer. Entering the 2010s, tech giants and startups competed fiercely in the quantum race: technological routes such as superconducting qubits and ion traps flourished, and the number of qubits on quantum chips gradually climbed from a dozen to over fifty. In 2019, Google's "Sycamore" processor (53 qubits) completed a specific computational task in 200 seconds that classical supercomputers would reportedly take thousands of years to accomplish. This event was dubbed "quantum supremacy," marking the first time a quantum computer surpassed classical computation in a specific aspect, although the advantage was limited to a contrived task and sparked some debate.
Stepping into the 2020s, quantum computing R&D in China, the US, and other countries accelerated further: companies like IBM successively released chips with hundreds of qubits, and China launched its 72-qubit superconducting quantum computer "Origin Wukong" (which went online in 2024 and has been used for fine-tuning billion-parameter AI models), demonstrating continuous progress in quantum hardware. Although current quantum computers remain primitive and error-prone, the journey from nothing to working prototypes is remarkable: yesterday's whimsical ideas are gradually growing into machines, and quantum computing is accelerating toward reality.
III. Principles at a Glance: Three "Superpowers" and Two "Hard Thresholds"
What is a Qubit?
Imagine a classical computer bit as a coin: it's either heads (0) or tails (1), with no in-between. A qubit, however, is like a spinning coin: before observation, it exists in a "fuzzy" state of both heads and tails simultaneously. This ability to hold many possibilities at once, combined with entanglement and interference, is what underlies the exponential speedups quantum algorithms can achieve on certain problems.
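To make the spinning coin concrete, here is a minimal sketch in plain NumPy (our illustration, not any particular quantum SDK): a qubit's state is a 2-element complex vector, the equal superposition gives 50/50 outcome probabilities, and each measurement collapses it to a definite 0 or 1.

```python
import numpy as np

# Basis states |0> and |1> as 2-element complex vectors.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# The "spinning coin": an equal superposition (|0> + |1>) / sqrt(2).
psi = (ket0 + ket1) / np.sqrt(2)

# Measurement probabilities are the squared amplitudes (the Born rule).
probs = np.abs(psi) ** 2
print(probs)                      # [0.5 0.5]

# Simulate 1000 measurements: each collapses to a definite 0 or 1.
samples = np.random.choice([0, 1], size=1000, p=probs)
print(np.bincount(samples))       # roughly 500 of each
```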

Quantum computers are highly anticipated precisely because they leverage three unique "superpowers" inherent in quantum mechanics. Simple analogies for each:
- Superposition: like the spinning coin above, a qubit holds many possibilities at once, so n qubits can represent 2^n states simultaneously.
- Entanglement: two or more qubits can be linked so tightly that measuring one instantly fixes the others, like a pair of gloves split between two sealed boxes.
- Interference: a quantum algorithm choreographs the "probability waves" of these states so that wrong answers cancel out and correct answers reinforce one another.
To harness these quantum superpowers for computation, researchers construct quantum circuits. In classical computers, bits are processed through logic gates like AND, OR, and NOT; similarly, in quantum computers we apply quantum gates to qubits to manipulate their states. A quantum algorithm is executed as a sequence of quantum gates that evolves the input state into the desired output distribution, as the sketch below illustrates.
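For instance, here is a minimal NumPy sketch (again our illustration, not a quantum SDK) of two standard gates acting on a 2-qubit state vector: a Hadamard gate puts the first qubit into superposition, and a CNOT gate then entangles the pair into a Bell state, in which only the outcomes 00 and 11 can ever be observed.

```python
import numpy as np

# Standard single-qubit Hadamard gate and two-qubit CNOT gate.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start in |00>: a 4-element state vector for two qubits.
state = np.array([1, 0, 0, 0], dtype=complex)

# Apply H to the first qubit (tensored with identity), then CNOT.
state = np.kron(H, I) @ state
state = CNOT @ state

# Result is the Bell state (|00> + |11>) / sqrt(2).
print(np.round(np.abs(state) ** 2, 3))   # [0.5 0. 0. 0.5]
```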
However, it is important to emphasize that quantum computing does not outperform classical computing on all problems. Many tasks remain more efficient on classical computers; the value of quantum computing lies in solving certain problems that classical computers can barely handle, not in replacing classical computing wholesale. Currently known quantum algorithmic advantages are concentrated in specific areas, such as integer factorization, database search, and quantum chemistry simulation, and are still far from truly general-purpose high-speed computing. The sketch below shows one such advantage in miniature.
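As a miniature of the search advantage mentioned above, the following NumPy sketch simulates Grover's algorithm on 3 qubits: among N = 8 items, about (pi/4)·sqrt(N) ≈ 2 oracle-plus-diffusion rounds boost the marked item's probability to roughly 0.95, whereas a classical search needs N/2 = 4 lookups on average. The marked index is an arbitrary choice for illustration.

```python
import numpy as np

n_qubits = 3
N = 2 ** n_qubits          # 8 items in the search space
marked = 5                 # arbitrary index we are searching for

# Uniform superposition over all N basis states.
state = np.full(N, 1 / np.sqrt(N), dtype=complex)

# Oracle: flip the sign of the marked state's amplitude.
oracle = np.eye(N)
oracle[marked, marked] = -1

# Diffusion operator: reflect all amplitudes about their mean.
diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)

# About (pi/4) * sqrt(8) ~= 2 Grover iterations are optimal for N = 8.
for _ in range(2):
    state = diffusion @ (oracle @ state)

print(np.round(np.abs(state) ** 2, 3))  # index 5 has probability ~0.945
```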
Despite the enticing prospects, truly harnessing the "three superpowers" of quantum computing requires overcoming two "hard thresholds": decoherence, since qubits are extremely fragile and environmental noise destroys their quantum states within fractions of a second; and error correction, since today's quantum gates are error-prone and fault-tolerant computation is expected to require combining many physical qubits into each reliable logical qubit.
IV. From Research to Commercialization
What can quantum computers actually do? The answer unfolds in layers. More importantly, they may ultimately reshape daily life and society as fundamentally as the Industrial Revolution transformed production from the steam engine onward. Below we analyze the specific transformations, the impact on the crypto industry, and energy consumption.

V. Industry Landscape: Routes, Cloud, and the "Pick-and-Shovel Sellers"
The current quantum computing industry is advancing along multiple technological routes simultaneously, and the entire ecosystem is being driven by cloud services and supporting industries. Below we detail mainstream routes and leading companies (based on 2025 advancements).
Technological Routes & Leading Companies:
- Superconducting circuits: currently the most mature route (IBM, Google, and China's Origin Quantum); fast gates, but requires cooling to near absolute zero.
- Trapped ions: the highest gate fidelities and long coherence times (IonQ, Quantinuum), though gates are slower and scaling is harder.
- Photonics: operates at room temperature and networks naturally over optical fiber (PsiQuantum, Xanadu).
- Neutral atoms: qubit counts scaling rapidly (QuEra, Pasqal).
- Topological qubits: Microsoft's long-horizon bet on hardware that resists errors by design.
These routes complement one another, and no single winner is likely in the short term. On the access side, cloud platforms are the main gateway:
- IBM Quantum: the first to open quantum hardware to the public, now with over 100,000 users.
- Amazon Braket: an aggregator offering hardware from multiple technological routes.
- Microsoft Azure Quantum: the strongest software ecosystem.
- Google Quantum AI: research-focused.
A minimal example of this cloud-style workflow follows below.
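As a taste of how these cloud services are used in practice, here is a minimal sketch in Qiskit, IBM's open-source SDK: it builds the Bell-state circuit from Section III and runs it on a local simulator, the same circuit one would submit to real cloud hardware. It assumes the qiskit and qiskit-aer packages are installed; exact APIs vary somewhat across versions.

```python
# pip install qiskit qiskit-aer   (assumed; APIs vary by version)
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

# Bell-state circuit: Hadamard on qubit 0, then CNOT from 0 to 1.
qc = QuantumCircuit(2)
qc.h(0)
qc.cx(0, 1)
qc.measure_all()

# Run 1000 shots on a local simulator; a cloud backend would slot in here.
backend = AerSimulator()
counts = backend.run(transpile(qc, backend), shots=1000).result().get_counts()
print(counts)   # roughly {'00': ~500, '11': ~500}: entangled outcomes only
```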
Besides these, there are some "Pick-and-Shovel" suppliers worth following: the makers of the cryogenics, control electronics, lasers, and measurement gear that every technological route depends on, such as dilution-refrigerator specialist Bluefors and control-electronics maker Zurich Instruments.
VI. Future Opportunities
For investors interested in quantum computing, this is a frontier field full of opportunity, but one accompanied by long-term uncertainty. We suggest following several main lines to frame the investment directions.
Disclaimer
The content of this website is intended for professional investors (as defined in the Securities and Futures Ordinance (Cap. 571) or regulations made thereunder).
The information in this website is for informational purposes only and does not constitute a recommendation or offer to provide services.
All information in this website should not be construed as professional or investment advice. Therefore, you should seek independent professional advice. Any use of this website and its contents is at your own risk.
The Company may terminate or change the information, products or services provided in this website at any time without prior notice to you.
No content on the website may be reproduced or publicly transmitted without the explicit consent and authorisation of the Poseidon Partner.