BosonSampling

Scott Aaronson (MIT)

Talk at BBN, October 30, 2013



The Extended Church-Turing Thesis (ECT)

Everything feasibly computable in the physical world is feasibly computable by a (probabilistic) Turing machine

Shor’s Theorem: Quantum Simulation has no efficient classical algorithm, unless Factoring does also



So the ECT is false … what more evidence could anyone want?

  • Building a QC able to factor large numbers is damn hard! After 16 years, no fundamental obstacle has been found, but who knows?

    • Can’t we “meet the physicists halfway,” and show computational hardness for quantum systems closer to what they actually work with now?

  • Factoring might have a fast classical algorithm! At any rate, it’s an extremely “special” problem

    • Wouldn’t it be great to show that if quantum computers can be simulated classically, then (say) P=NP?


BosonSampling (A.-Arkhipov 2011)

A rudimentary type of quantum computing, involving only non-interacting photons

Classical counterpart: Galton’s Board

Replacing the balls by photons leads to famously counterintuitive phenomena, like the Hong-Ou-Mandel dip


In general, we consider a network of beamsplitters, with n input “modes” (locations) and m >> n output modes

n identical photons enter, one per input mode

Assume for simplicity they all leave in different modes; there are $\binom{m}{n}$ possibilities

The beamsplitter network defines a column-orthonormal matrix $A \in \mathbb{C}^{m \times n}$, such that

$$\Pr[S] = |\mathrm{Per}(A_S)|^2,$$

where $A_S$ is the $n \times n$ submatrix of $A$ corresponding to the output configuration $S$, and

$$\mathrm{Per}(X) = \sum_{\sigma \in S_n} \prod_{i=1}^{n} x_{i,\sigma(i)}$$

is the matrix permanent
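To make the definition concrete, here is a minimal sketch (not from the talk) that computes an output probability as $|\mathrm{Per}(A_S)|^2$ with a naive sum-over-permutations permanent; the 50:50 beamsplitter used in the demo is the standard Hong-Ou-Mandel example.

```python
# Minimal sketch (illustration only): the probability of a collision-free
# output configuration S is |Per(A_S)|^2, where A_S is the n x n submatrix
# of the column-orthonormal matrix A picked out by the occupied modes.
from itertools import permutations
import numpy as np

def permanent(X):
    """Naive permanent: sum over all n! permutations (fine only for small n)."""
    n = X.shape[0]
    return sum(np.prod([X[i, sigma[i]] for i in range(n)])
               for sigma in permutations(range(n)))

def output_probability(A, input_modes, output_modes):
    """|Per(A_S)|^2 for photons entering `input_modes` and exiting `output_modes`."""
    A_S = A[np.ix_(output_modes, input_modes)]
    return abs(permanent(A_S)) ** 2

# Hong-Ou-Mandel: a 50:50 beamsplitter with one photon in each input mode.
U = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
print(output_probability(U, [0, 1], [0, 1]))  # ~0.0: the photons never exit in different modes
```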



Example

For the Hong-Ou-Mandel experiment, $\Pr[\text{the two photons exit in different modes}] = \left|\mathrm{Per}\!\begin{pmatrix} 1/\sqrt{2} & 1/\sqrt{2} \\ 1/\sqrt{2} & -1/\sqrt{2} \end{pmatrix}\right|^2 = \left|\tfrac{1}{2} - \tfrac{1}{2}\right|^2 = 0$

In general, an nn complex permanent is a sum of n! terms, almost all of which cancel

How hard is it to estimate the “tiny residue” left over?

Answer: #P-complete, even for constant-factor approx

(Contrast with nonnegative permanents!)



So, Can We Use Quantum Optics to Solve a #P-Complete Problem?

That sounds way too good to be true…

Explanation: If X is sub-unitary, then $|\mathrm{Per}(X)|^2$ will usually be exponentially small. So to get a reasonable estimate of $|\mathrm{Per}(X)|^2$ for a given X, we’d generally need to repeat the optical experiment exponentially many times
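As a rough numerical illustration of that explanation (my own sketch, not from the talk), $|\mathrm{Per}(X)|^2$ for an $n \times n$ corner of a Haar-random $m \times m$ unitary is already tiny at small $n$, typically of order $n!/m^n$:

```python
# Sketch (illustration only): |Per(X)|^2 for an n x n corner of a Haar-random
# m x m unitary is typically of order n!/m^n, so reading it off from repeated
# experiments would take exponentially many runs.
from itertools import permutations
from math import factorial
import numpy as np

def permanent(X):
    n = X.shape[0]
    return sum(np.prod([X[i, s[i]] for i in range(n)]) for s in permutations(range(n)))

def haar_unitary(m, rng):
    """Haar-random unitary via QR of a complex Gaussian matrix (standard construction)."""
    Z = (rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m))) / np.sqrt(2)
    Q, R = np.linalg.qr(Z)
    return Q * (np.diag(R) / np.abs(np.diag(R)))  # fix column phases

rng = np.random.default_rng(0)
n, m, trials = 4, 40, 200
vals = [abs(permanent(haar_unitary(m, rng)[:n, :n])) ** 2 for _ in range(trials)]
print(np.mean(vals), factorial(n) / m**n)  # both of order 1e-5 already at n = 4
```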



Better idea: Given ACmn as input, let BosonSampling be the problem of merely sampling from the same distribution DA that the beamsplitter network samples from—the one defined by Pr[S]=|Per(AS)|2

Theorem (A.-Arkhipov 2011): Suppose BosonSampling is solvable in classical polynomial time. Then $\mathsf{P}^{\#\mathsf{P}} = \mathsf{BPP}^{\mathsf{NP}}$

Upshot: Compared to (say) Shor’s factoring algorithm, we get different/stronger evidence that a weaker system can do something classically hard

Better Theorem: Suppose we can sample $\mathcal{D}_A$ even approximately in classical polynomial time. Then in $\mathsf{BPP}^{\mathsf{NP}}$, it’s possible to estimate $\mathrm{Per}(X)$, with high probability over a Gaussian random matrix $X$

We conjecture that the above problem is already #P-complete. If it is, then a fast classical algorithm for approximate BosonSampling would already have the consequence that $\mathsf{P}^{\#\mathsf{P}} = \mathsf{BPP}^{\mathsf{NP}}$



Related Work

Valiant 2001, Terhal-DiVincenzo 2002, “folklore”: A QC built of noninteracting fermions can be efficiently simulated by a classical computer

Knill, Laflamme, Milburn 2001: Noninteracting bosons plus adaptive measurements yield universal QC

Jerrum-Sinclair-Vigoda 2001: Fast classical randomized algorithm to approximate Per(A) for nonnegative A

Gurvits 2002: O(n2/2) classical randomized algorithm to approximate an n-photon amplitude to ± additive error (also, to compute k-mode marginal distribution in nO(k) time)



BosonSampling Experiments

Last year, groups in Brisbane, Oxford, Rome, and Vienna reported the first 3-photon BosonSampling experiments, confirming that the amplitudes were given by 3×3 permanents

# of experiments > # of photons!



  • Obvious Challenges for Scaling Up:

    • Reliable single-photon sources (optical multiplexing?)

    • Minimizing losses

    • Getting high probability of n-photon coincidence

  • Goal (in our view): Scale to 10-30 photons

  • Don’t want to scale much beyond that, both because:

    • you probably can’t without fault-tolerance, and

    • a classical computer probably couldn’t even verify the results! (See the rough cost sketch below.)
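A back-of-the-envelope calculation (my own, under the assumption that “verify” means computing the relevant permanents) behind the 10-30 photon target: checking an n-photon experiment requires $n \times n$ permanents, roughly $2^n \cdot n$ arithmetic operations each with Ryser’s formula.

```python
# Rough verification cost (my own estimate, not from the talk): ~2^n * n
# operations per n x n permanent via Ryser's formula. Feasible around n = 30,
# hopeless well beyond that.
for n in (10, 20, 30, 50, 100):
    ops = 2 ** n * n
    print(f"n = {n:3d}: ~{ops:.1e} operations per permanent")
```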



Scattershot BosonSampling

Exciting new idea, proposed by Steve Kolthammer, for sampling a hard distribution even with highly unreliable (but heralded) photon sources, like SPDCs

The idea: Say you have 100 sources, of which only 10 (on average) generate a photon. Then just detect which sources succeed, and use those to define your BosonSampling instance!

Complexity analysis goes through essentially without change

Issues: Increases depth of optical network needed. Also, if some sources generate ≥2 photons, need a new hardness assumption
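A toy sketch of the scattershot setup described above (illustration only; the source count and firing probability are made-up parameters): whichever heralded sources fire on a given run select the columns of the interferometer matrix that define that run’s BosonSampling instance.

```python
# Toy scattershot sketch (illustration only): each heralded source fires
# independently with probability p; the sources that fire pick out the input
# modes, i.e. the columns of the full m x m interferometer matrix.
import numpy as np

rng = np.random.default_rng(3)
m, p = 100, 0.1                        # 100 heralded sources; each fires with probability ~0.1
fired = np.flatnonzero(rng.random(m) < p)
Z = rng.standard_normal((m, m)) + 1j * rng.standard_normal((m, m))
U = np.linalg.qr(Z)[0]                 # the fixed m x m interferometer (Haar-ish, for illustration)
A = U[:, fired]                        # this run's column-orthonormal BosonSampling instance
print(f"{len(fired)} sources fired; this run's instance A has shape {A.shape}")
```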



Recent Criticisms of Gogolin et al. (arXiv:1306.3995)

Suppose you ignore which actual photodetectors light up, and count only the number of times each output configuration occurs. In that case, the BosonSampling distribution $\mathcal{D}_A$ is exponentially close to the uniform distribution $\mathcal{U}$

Response: Why would you ignore which detectors light up??

The output of almost any algorithm is also gobbledygook if you ignore the order of the output bits…



Recent Criticisms of Gogolin et al. (arXiv:1306.3995)

OK, so maybe $\mathcal{D}_A$ isn’t close to uniform. Still, the very same arguments we gave for why polynomial-time classical algorithms can’t sample $\mathcal{D}_A$ suggest that they can’t even distinguish $\mathcal{D}_A$ from $\mathcal{U}$!

Response: That’s why we said to focus on 10-30 photons: a range where a classical computer can verify a BosonSampling device’s output, but the BosonSampling device might be “faster”! (And 10-30 photons is probably the best you can do anyway, without quantum fault-tolerance)



More Decisive Responses (A.-Arkhipov, arXiv:1309.7460)

Theorem: Let ACmn be a Haar-random BosonSampling matrix, where m≥n5.1/. Then with 1-O() probability over A, the BosonSampling distribution DA has Ω(1) variation distance from the uniform distribution U

[Figure: histogram of the (normalized) outcome probabilities under $\mathcal{D}_A$ versus under $\mathcal{U}$]

Necessary, though not sufficient, for approximately sampling $\mathcal{D}_A$ to be hard
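A small numeric check in the spirit of this theorem (tiny $n$ and $m$ only, far below the theorem’s regime, and restricted to collision-free outcomes): build a Haar-random column-orthonormal $A$, compute $\mathcal{D}_A$ by brute force, and measure its variation distance from uniform.

```python
# Small numeric check (illustration only): the collision-free part of D_A is
# typically a noticeable constant away from uniform in variation distance.
from itertools import combinations, permutations
import numpy as np

def permanent(X):
    n = X.shape[0]
    return sum(np.prod([X[i, s[i]] for i in range(n)]) for s in permutations(range(n)))

rng = np.random.default_rng(4)
m, n = 10, 3
Z = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
A = np.linalg.qr(Z)[0]                         # Haar-ish column-orthonormal matrix

outcomes = list(combinations(range(m), n))
p = np.array([abs(permanent(A[list(S), :])) ** 2 for S in outcomes])
p /= p.sum()                                   # D_A restricted to collision-free outcomes
u = np.full(len(outcomes), 1.0 / len(outcomes))
print("variation distance from uniform:", 0.5 * np.abs(p - u).sum())
```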



Theorem (A. 2013): Let ACmn be Haar-random, where m>>n. Then there is a classical polynomial-time algorithm C(A) that distinguishes DA from U (with high probability over A and constant bias, and using only O(1) samples)

Strategy: Let $A_S$ be the $n \times n$ submatrix of $A$ corresponding to output $S$. Let $P$ be the product of squared 2-norms of $A_S$’s rows. If $P > \mathbb{E}[P]$, then guess $S$ was drawn from $\mathcal{D}_A$; otherwise guess $S$ was drawn from $\mathcal{U}$

[Figure: $A_S$ as an $n \times n$ submatrix of $A$; the distribution of $P$ under the uniform distribution (a lognormal random variable) versus under a BosonSampling distribution]
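Here is a sketch of that row-norm distinguisher on a tiny instance (my implementation of the strategy as stated above, not the paper’s code): compute $P(S)$, compare it to its expectation under $\mathcal{U}$, and check that the test fires more often under $\mathcal{D}_A$ than under $\mathcal{U}$.

```python
# Row-norm distinguisher sketch (illustration only, tiny instance):
# P(S) = product over rows of A_S of the squared 2-norm; P tends to be larger
# under D_A than under the uniform distribution U.
from itertools import combinations, permutations
import numpy as np

def permanent(X):
    n = X.shape[0]
    return sum(np.prod([X[i, s[i]] for i in range(n)]) for s in permutations(range(n)))

def P_stat(A, S):
    """Product of the squared 2-norms of the rows of A_S."""
    return np.prod(np.sum(np.abs(A[list(S), :]) ** 2, axis=1))

rng = np.random.default_rng(5)
m, n, samples = 12, 3, 2000
Z = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
A = np.linalg.qr(Z)[0]

outcomes = list(combinations(range(m), n))
probs = np.array([abs(permanent(A[list(S), :])) ** 2 for S in outcomes])
probs /= probs.sum()                                      # D_A (collision-free part)

threshold = np.mean([P_stat(A, S) for S in outcomes])     # E[P] under the uniform distribution
from_DA = [P_stat(A, outcomes[i]) for i in rng.choice(len(outcomes), size=samples, p=probs)]
from_U = [P_stat(A, outcomes[i]) for i in rng.choice(len(outcomes), size=samples)]
print("Pr[P > E[P]] under D_A:", np.mean(np.array(from_DA) > threshold))
print("Pr[P > E[P]] under U:  ", np.mean(np.array(from_U) > threshold))
```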



Using Quantum Optics to Prove that the Permanent is #P-Complete [A., Proc. Roy. Soc. 2011]

Valiant showed that the permanent is #P-complete—but his proof required strange, custom-made gadgets

  • We gave a new, arguably more transparent proof by combining three facts:

  • (1) n-photon amplitudes correspond to $n \times n$ permanents

  • (2) Postselected quantum optics can simulate universal quantum computation [Knill-Laflamme-Milburn 2001]

  • (3) Quantum computations can encode #P-complete quantities in their amplitudes



Open Problems

Prove that Gaussian permanent approximation is #P-hard (first step: understand distribution of Gaussian permanents)

Can the BosonSampling model solve classically-hard decision problems? With verifiable answers?

Can one efficiently sample a distribution that can’t be efficiently distinguished from BosonSampling?

Are we still sampling a hard distribution with unheralded photon losses, or Gaussian initial states?

Similar hardness results for other natural quantum systems (besides linear optics)?

Bremner, Jozsa, Shepherd 2010: Another system for which exact classical simulation would collapse PH

