nilslice | 4 hours ago
I worked at one of the quantum computing companies on their compiler stack (so pretty much pure classical compute), but to get even a baseline understanding of computation and programming with qubits, I first had to build a better intuition for the underlying quantum mechanics. This was a great introduction to the physics underpinning the computations:
https://www.youtube.com/watch?v=lZ3bPUKo5zc&list=PLUl4u3cNGP...
It's long, and the subject matter is intimidating at times, but watch, re-watch, then go deep by finding papers on subjects like superposition and entanglement, which are the key quantum phenomena that unlock quantum computing.
It also helps to understand a bit about how the various qubit modalities are physically operated and controlled (e.g. how does a program turn into qubit rotations, readouts, and other instruction executions?). Some platforms are superconducting chips driven by electromagnetic (microwave) pulses, some suspend an ion/atom and use lasers to change its state, and some are photonic chips moving light through gates - among a handful of other modalities in industry and academia.
IBM's Qiskit platform may still have tooling, simulators, and visualizers that help you write a program and step through the operations on the qubit(s) managed by the program:
https://www.ibm.com/quantum/qiskit
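As a minimal illustration (my own sketch, not from the course above; it assumes the qiskit and qiskit-aer Python packages, and the pi/2 angle and gate choices are arbitrary), here is how a tiny program becomes a rotation, an entangling gate, and readouts:

    # Toy Qiskit circuit: one rotation, one entangling gate, then measurement.
    import numpy as np
    from qiskit import QuantumCircuit, transpile
    from qiskit_aer import AerSimulator

    qc = QuantumCircuit(2, 2)
    qc.rx(np.pi / 2, 0)          # a "program instruction" becomes a qubit rotation
    qc.cx(0, 1)                  # CNOT entangles the two qubits
    qc.measure([0, 1], [0, 1])   # readout into classical bits

    sim = AerSimulator()
    counts = sim.run(transpile(qc, sim), shots=1000).result().get_counts()
    print(counts)                # roughly equal '00' and '11' counts

Stepping through small circuits like this in a simulator is a good way to build intuition before touching real hardware.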

ktallett | 4 hours ago
It does! They also still have all their summer schools up that you can go through step by step. Although I must promote Strawberry Fields, as I believe photonic integrated systems really are the better option.

ktallett | 4 hours ago
1/ Digital and analog - where digital means qubits and analog means photonics, diamonds, or a range of other bit replacements.
2/ Qubits and gates are the building blocks and operations in digital. Photons, diamonds, electrons, and so on are the bits in analog; you can encode information in any of these in various ways.
3/ Strawberry Fields for analog QC, and IBM's Qiskit for digital.
I work on photonic integrated circuits, adapting them to remove the physical limitations on capacity, such as heat and information loss.
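To give a flavour of the analog/photonic side, a minimal Strawberry Fields sketch (my own toy example; the squeezing and beamsplitter parameters are arbitrary illustrative values):

    # Toy two-mode photonic circuit in Strawberry Fields.
    import strawberryfields as sf
    from strawberryfields.ops import Sgate, BSgate, MeasureFock

    prog = sf.Program(2)              # two optical modes instead of two qubits
    with prog.context as q:
        Sgate(0.5) | q[0]             # squeeze mode 0: encode information in light
        BSgate() | (q[0], q[1])       # beamsplitter couples/entangles the modes
        MeasureFock() | q             # photon-number readout on both modes

    eng = sf.Engine("fock", backend_options={"cutoff_dim": 5})
    print(eng.run(prog).samples)      # e.g. [[0 1]] photon counts per mode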
I have "Essential Mathematics for Quantum Computing" by Woody and "Non-Standard Computation" by Gramß, et al. Both were worth reading, but assumed a bit of background with "foundations of computation."

hershkumar | 4 hours ago
1) Generally, the two models of QC are the digital/circuit model (analogous to digital logic gates, with some caveats such as reversibility of operations and the no-cloning theorem) and analog computation (tuning the parameters of a continuous-time quantum system in your lab such that the system produces useful output).
2) The physics/architecture/organization depends heavily on the type of computer being discussed. In classical computing, one "type" of computer has won the arms race. This is not yet the case for quantum computers; there are several different physical platforms through which people are trying to generate computation: trapped ions, superconducting qubits, photonics, quantum dots, neutral atoms, etc.
3) There are several ways to simulate quantum computation on classical hardware. Perhaps the most common is through something like IBM's Qiskit, where you keep track of the degrees of freedom of the quantum computer throughout the computation and apply quantum logic gates in circuits (a toy version is sketched after this list). Another, more complicated method is tensor network simulation, which gives efficient classical simulators for a restricted subset of quantum states.
4) In terms of research, one particularly interesting application (although I'm biased by working in the field) is quantum algorithms for nuclear/high-energy physics. Classical methods (lattice QCD) suffer from extreme computational drawbacks (factorial scaling in the number of quarks, NP-hard Monte Carlo sign problems), and one potential way around this is to simulate nuclear systems on quantum computers instead of classical ones ("The best model of a cat is another cat; the best model of a quantum system is another quantum system").
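To make 1) and 3) concrete, here is a toy statevector simulation in plain numpy (my own sketch, not any library's API): the state is a vector of 2^n complex amplitudes, gates are unitary matrices, and unitarity is exactly the reversibility caveat from 1).

    # Toy statevector simulation of 2 qubits (2^2 = 4 complex amplitudes).
    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard: superposition
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])                # entangling gate
    I2 = np.eye(2)

    state = np.zeros(4, dtype=complex)
    state[0] = 1.0                                 # start in |00>

    U = CNOT @ np.kron(H, I2)                      # H on qubit 0, then CNOT
    state = U @ state                              # Bell state (|00> + |11>)/sqrt(2)
    print(np.round(state, 3))

    # Reversibility: U is unitary, so applying U-dagger undoes the computation.
    assert np.allclose(U.conj().T @ U, np.eye(4))
    print(np.round(U.conj().T @ state, 3))         # back to |00>

The vector has 2^n entries, which is why this direct approach blows up for large n and why tensor-network methods matter for restricted classes of states.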
If you're interested in learning more about QC, I would highly recommend Nielsen and Chuang's "Quantum Computation and Quantum Information"; it's essentially the standard primer on the world of quantum computation.

[OP] rramadass | 7 minutes ago
The Nielsen/Chuang book is what I see recommended everywhere, so I am definitely going to get it. What others would you recommend?
I had recently asked a similar question about books on "Modern Physics" (essentially quantum physics + relativity) here: https://news.ycombinator.com/item?id=46473352 - so, given your profile, what would be your recommendations?
PS: You might want to add your website URL to your HN profile, since your Physics Notes might be helpful to a lot of other folks too. :-)

godsmokescrack | 4 hours ago
More mathy: A. Yu. Kitaev, A. H. Shen, M. N. Vyalyi, "Classical and Quantum Computation"
A killer app: Peter Shor, "Polynomial-Time Algorithms for Prime Factorization and Discrete Logarithms on a Quantum Computer"
Some course notes: https://math.mit.edu/~shor/435-LN/

[OP] rramadass | 32 minutes ago
These are the two books I had zeroed in on before asking here.
However, how approachable is the "Classical and Quantum Computation" book? Mathematics is fine as long as it is accessible. Also, how good is its explanation of the analogy/comparison between concepts from "Classical Computation" vs. "Quantum Computation"? I believe this is the best way to learn this subject, and hence am quite interested to know how this book does it.

danielam | 3 hours ago
The classic text is Nielsen and Chuang's "Quantum Computation and Quantum Information" [0]. Whatever else you choose to supplement this book with, it is worth having in your library.
[0] https://a.co/d/aPsexRB

fasterik | 3 hours ago
Nielsen and Chuang has the clearest exposition of quantum mechanics I've seen anywhere. Last year I was trying to learn quantum mechanics, not necessarily quantum computation, just out of a general interest in theoretical physics. I started with physics textbooks (Griffiths and Shankar), but it only really "clicked" for me when I read the first few chapters of Nielsen and Chuang.