Introduction to Quantum Information Theory
Iordanis Kerenidis
CNRS and LRI-Univ. de Paris-Sud
Quantum computation and information studies how information is encoded
in nature according to the laws of quantum mechanics and what this means for
its computational power. In this note, we present a short and rather schematic
introduction to quantum information theory by drawing comparisons to classical probability theory. For more details on quantum information theory and
computation we refer to [3].
A binary random variable X is a system with two possible states 0 and 1.
Similarly, a quantum bit (qubit) is a quantum mechanical system which can be in the state $|0\rangle$, the state $|1\rangle$, or any superposition of these states. In other words, a quantum bit is a unit vector $a_0 |0\rangle + a_1 |1\rangle$ in a two-dimensional Hilbert space, where $a_0, a_1 \in \mathbb{C}$ and $|a_0|^2 + |a_1|^2 = 1$. By tensoring such systems together we can define larger quantum states, for example over $\log n$ qubits as $|\phi\rangle = \sum_{i=0}^{n-1} a_i |i\rangle$, with $\sum_{i=0}^{n-1} |a_i|^2 = 1$.
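As a rough illustration (not part of the original note; the amplitudes below are arbitrary), such a state can be represented numerically as a complex unit vector of length $n$:

```python
import numpy as np

n = 4  # a state over log2(n) = 2 qubits

# arbitrary complex amplitudes, normalised so that sum_i |a_i|^2 = 1
a = np.array([1 + 1j, 0.5, -0.3j, 2.0], dtype=complex)
a = a / np.linalg.norm(a)

assert np.isclose(np.sum(np.abs(a) ** 2), 1.0)
print("amplitudes:", a)
```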
A random variable $X$ with probability distribution $P = \{p_0, p_1\}$ evolves by multiplying the probability vector by a stochastic matrix $S$, i.e. a matrix that preserves the $\ell_1$-norm. The new probability vector is $P' = S \cdot P$. Moreover, a measurement of the random variable has $\Pr[X = b] = p_b$, with $p_b \in [0, 1]$.
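A minimal numerical sketch of this classical picture, with an arbitrary column-stochastic matrix chosen only for illustration:

```python
import numpy as np

P = np.array([0.7, 0.3])             # distribution {p0, p1}
S = np.array([[0.9, 0.2],            # column-stochastic: each column sums to 1
              [0.1, 0.8]])

P_new = S @ P                         # P' = S . P
assert np.isclose(P_new.sum(), 1.0)   # the l1-norm (total probability) is preserved

print("P' =", P_new)                  # Pr[X = b] is simply the b-th entry
```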
Let us see how a quantum bit evolves. A quantum bit $|\phi\rangle = a_0 |0\rangle + a_1 |1\rangle$ can evolve by a unitary matrix $U$, i.e. a matrix that preserves the $\ell_2$-norm, and the new state becomes $|\phi'\rangle = U \cdot |\phi\rangle$. In addition, we can perform a projective measurement of a state $|\phi\rangle$ in an orthonormal basis $\{b_1, b_2, \ldots, b_n\}$ and have $\Pr[\text{outcome is } b_i] = |\langle \phi | b_i \rangle|^2$.
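A corresponding sketch for a single qubit, using the Hadamard gate as an example unitary and measuring in the computational basis (an illustrative choice, not from the original note):

```python
import numpy as np

phi = np.array([1.0, 0.0], dtype=complex)            # |phi> = |0>

H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)   # a unitary (Hadamard), preserves the l2-norm

phi_new = H @ phi                                     # |phi'> = U |phi>

basis = [np.array([1, 0], dtype=complex),             # |0>
         np.array([0, 1], dtype=complex)]             # |1>

probs = [abs(np.vdot(b, phi_new)) ** 2 for b in basis]  # Pr[outcome b_i] = |<phi'|b_i>|^2
print("outcome probabilities:", probs)                # [0.5, 0.5]
```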
More generally, we can define a mixed quantum state, i.e. a classical probability distribution over quantum states. For example, a mixed state $\rho$ can be given as an ensemble of states $\{|\phi_i\rangle\}$ with probabilities $p_i$. We can rewrite a mixed state as a Hermitian, positive, trace-one matrix, called the density matrix, $\rho = \sum_{i=1}^{n} p_i |\phi_i\rangle\langle\phi_i|$. The density matrix contains all necessary information about a quantum state. More precisely, the quantum state $\rho$ evolves by a unitary $U$ to the state $\rho' = U \rho U^{\dagger}$, and a projective measurement has
$$\Pr[\text{outcome } b_k] = \sum_i p_i |\langle \phi_i | b_k \rangle|^2 = \sum_i p_i \langle b_k | \phi_i \rangle \langle \phi_i | b_k \rangle = \langle b_k | \Big( \sum_i p_i |\phi_i\rangle\langle\phi_i| \Big) |b_k\rangle = \langle b_k | \rho | b_k \rangle.$$
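The following sketch (with an arbitrarily chosen ensemble and unitary, purely for illustration) builds a density matrix, evolves it, and checks that $\langle b_k|\rho'|b_k\rangle$ reproduces the ensemble average of the Born probabilities:

```python
import numpy as np

# ensemble {|phi_i>} with probabilities p_i (chosen only for illustration)
phis = [np.array([1, 0], dtype=complex),
        np.array([1, 1], dtype=complex) / np.sqrt(2)]
ps = [0.5, 0.5]

rho = sum(p * np.outer(phi, phi.conj()) for p, phi in zip(ps, phis))
assert np.isclose(np.trace(rho).real, 1.0)            # Hermitian, positive, trace one

U = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
rho_new = U @ rho @ U.conj().T                         # rho' = U rho U^dagger

b = np.array([1, 0], dtype=complex)                    # measurement vector |b_k> = |0>
p_from_rho = np.real(b.conj() @ rho_new @ b)           # <b_k| rho' |b_k>
p_from_ensemble = sum(p * abs(np.vdot(b, U @ phi)) ** 2 for p, phi in zip(ps, phis))
assert np.isclose(p_from_rho, p_from_ensemble)
print("Pr[outcome b_k] =", p_from_rho)
```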
Let us note that two mixed states may look very different as ensembles of quantum states and yet correspond to the same density matrix. For example,
$$\rho = \begin{cases} |0\rangle, & \text{w.p. } \tfrac{1}{2} \\[2pt] \tfrac{1}{\sqrt{2}}\,(|0\rangle - |1\rangle), & \text{w.p. } \tfrac{1}{2} \end{cases}
\qquad
\rho = \begin{cases} \tfrac{\sqrt{3}}{2}\,|0\rangle - \tfrac{1}{2}\,|1\rangle, & \text{w.p. } \tfrac{1}{\sqrt{3}} \\[2pt] |0\rangle, & \text{w.p. } \tfrac{3}{4}\big(1 - \tfrac{1}{\sqrt{3}}\big) \\[2pt] |1\rangle, & \text{w.p. } \tfrac{1}{4}\big(1 - \tfrac{1}{\sqrt{3}}\big) \end{cases}
\qquad
\rho = \begin{pmatrix} \tfrac{3}{4} & -\tfrac{1}{4} \\[2pt] -\tfrac{1}{4} & \tfrac{1}{4} \end{pmatrix}$$
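As a quick numerical sanity check (a sketch, not part of the original note), one can verify that both ensembles produce the same density matrix:

```python
import numpy as np

def dm(pairs):
    """Density matrix of an ensemble given as (probability, state-vector) pairs."""
    return sum(p * np.outer(v, v.conj()) for p, v in pairs)

s3 = np.sqrt(3)
ens1 = [(0.5, np.array([1, 0], dtype=complex)),
        (0.5, np.array([1, -1], dtype=complex) / np.sqrt(2))]
ens2 = [(1 / s3,              np.array([s3 / 2, -0.5], dtype=complex)),
        (0.75 * (1 - 1 / s3), np.array([1, 0], dtype=complex)),
        (0.25 * (1 - 1 / s3), np.array([0, 1], dtype=complex))]

rho = np.array([[0.75, -0.25], [-0.25, 0.25]])
assert np.allclose(dm(ens1), rho)
assert np.allclose(dm(ens2), rho)
```

Since all measurement statistics depend only on $\rho$, no measurement can distinguish the two ensembles.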
We now introduce a notion of entropy of a mixed quantum state. Note that the Shannon entropy of a random variable $X$ with distribution $P$ is defined as $H(X) = -\sum_i p_i \log p_i$ and captures the randomness in a measurement of the variable. We define a similar notion for quantum states; however, as we saw, a mixed state can be described by different distributions over ensembles of states. Hence, we look at the description of a state as a density matrix and define the von Neumann entropy of a quantum state as $S(\rho) = -\sum_i \lambda_i \log \lambda_i$, where the $\lambda_i$ are the eigenvalues of the matrix. Since the matrix is positive and has trace one, the eigenvalues play the role of a probability distribution. In fact, they represent the probabilities of the outcomes of a measurement in the basis of the eigenvectors of the state, which is the measurement that minimizes the Shannon entropy of the outcome distribution.
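A sketch computing the von Neumann entropy from the eigenvalues (base-2 logarithms, numpy assumed), reusing the density matrix from the example above; it also shows that measuring in the computational basis yields a larger Shannon entropy than measuring in the eigenbasis:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -sum_i lambda_i log2(lambda_i), over the non-zero eigenvalues."""
    lam = np.linalg.eigvalsh(rho)          # real eigenvalues of a Hermitian matrix
    lam = lam[lam > 1e-12]                 # 0 log 0 is taken to be 0
    return float(-np.sum(lam * np.log2(lam)))

rho = np.array([[0.75, -0.25], [-0.25, 0.25]])
print("S(rho)  =", von_neumann_entropy(rho))          # ~0.60 (eigenbasis measurement)

# measuring in the computational basis gives Shannon entropy H(3/4, 1/4) >= S(rho)
diag = np.diag(rho)
print("H(diag) =", float(-np.sum(diag * np.log2(diag))))  # ~0.81
```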
The notions of Shannon and von Neumann entropy share many important properties: for example, they are always non-negative and they are upper bounded by the size of the system. Moreover, we can define conditional von Neumann entropy and mutual information for quantum states similarly to the classical case. In addition, we can prove important properties like strong subadditivity and Fano's inequality. However, there are differences between the two measures; for example, the conditional von Neumann entropy can take negative values.
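For instance (a sketch not in the original note, using the convention $S(A|B) = S(AB) - S(B)$ and base-2 logarithms), a maximally entangled pair of qubits has $S(A|B) = -1$:

```python
import numpy as np

def von_neumann_entropy(rho):
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]
    return float(-np.sum(lam * np.log2(lam)))

# maximally entangled state |psi> = (|00> + |11>) / sqrt(2)
psi = np.zeros(4, dtype=complex)
psi[0] = psi[3] = 1 / np.sqrt(2)
rho_ab = np.outer(psi, psi.conj())

# partial trace over A: rho_B[j, l] = sum_i rho_AB[(i, j), (i, l)]
rho_b = rho_ab.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)

S_ab = von_neumann_entropy(rho_ab)   # 0: the joint state is pure
S_b  = von_neumann_entropy(rho_b)    # 1: the reduced state is maximally mixed
print("S(A|B) =", S_ab - S_b)        # -1, impossible for the Shannon entropy
```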
Quantum information theory is a powerful tool for the study of quantum computation and information. A main question is whether quantum information is more powerful than classical information. A celebrated result by Holevo shows that quantum information cannot be used to compress classical information: in order to transmit $n$ random classical bits, one needs to transmit no fewer than $n$ quantum bits. This might suggest that quantum information is no more powerful than classical information; however, this turns out to be wrong in many situations.
In the model of communication complexity, one can show that transmitting quantum information results in exponential savings in the communication needed to solve specific problems ([4, 1]). Moreover, quantum information enables us to perform unconditionally secure cryptographic primitives, for example key distribution, which are impossible in the classical world. Last, quantum information can be used as a mathematical theory for the study of classical information. For example, one can obtain optimal bounds for classical locally decodable codes by reducing the problem to a quantum encoding problem and using quantum information theory to resolve it ([2]).
References
1. D. Gavinsky, J. Kempe, I. Kerenidis, R. Raz, R. de Wolf. Exponential separations for one-way quantum communication complexity, with applications to cryptography. In Proceedings of the 39th ACM Symposium on Theory of Computing (STOC), 2007.
2. I. Kerenidis, R. de Wolf. Exponential lower bound for 2-query locally decodable codes via a quantum argument. In Proceedings of the 35th ACM Symposium on Theory of Computing (STOC), pages 106–115, 2003.
3. M. A. Nielsen and I. L. Chuang. Quantum Computation and Quantum Information. Cambridge University Press, 2000.
4. R. Raz. Exponential separation of quantum and classical communication complexity. In Proceedings of the 31st ACM Symposium on Theory of Computing (STOC), pages 358–367, 1999.