Bits, Qubits and Neurons
salishan.ahsc-nm.org/uploads/4/9/7/0/49704495/2017-camp.pdf
TRANSCRIPT
Bits, Qubits and Neurons
The near (and not so near) future of Computing
Bill Camp
Preamble
• (This could be used as a course syllabus!)
Let's look at unexpected connections.
The Ising Model provides
The simplest non-trivial model of: a magnet, a ferroelectric, a liquid, a binary alloy, a glass, a quantum field theory, quantum computing, neural nets,
…
The Ising Model in a parallel field
H = -Σ_{i,j} J_ij σ^z_i σ^z_j - Σ_i h_i σ^z_i,  with σ^z_i = ±1
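The Hamiltonian above is easy to evaluate directly. A minimal NumPy sketch (not from the talk; the system size and random couplings are illustrative assumptions) computes the Ising energy of a spin configuration:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 8
sigma = rng.choice([-1, 1], size=N)            # sigma_i = +/-1
J = rng.normal(size=(N, N))
J = (J + J.T) / 2                              # symmetric couplings J_ij
np.fill_diagonal(J, 0.0)                       # no self-coupling
h = rng.normal(size=N)                         # parallel field h_i

# H = -sum_{i,j} J_ij sigma_i sigma_j - sum_i h_i sigma_i
E = -sigma @ J @ sigma - h @ sigma
```

The same expression covers every system on the slide above; only the choice of J_ij and the meaning of σ change.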
Omnis Computatio in tres partes divisus est#
• Turing/Von Neumann-based
• Neuro-inspired
• Quantum-based
# All of computing is divided into three parts -- with apologies to those who have read Caesar
Quantum Computing
Gate and circuit-based
Adiabatic quantum relaxation-based
A warp-speed look at quantum mechanics
• Newton got it right if things weren't too tiny, too heavy, or too fast!
• Newton's physics is deterministic and reversible in the small but irreversible in the large… try unmixing fluids by reversing the paddle.
Demystifying Quantum Physics
• "Things" are described by their "State", a vector in a linear space of vectors (a Hilbert space).
• Reversible and deterministic – until you measure something
• Newton's physical variables are no longer numbers – position, energy, momentum, angular momentum, dipole moment, …
• They are (self-adjoint) operators in that Hilbert space
Demystifying Quantum Physics
• Self-adjoint operators: A = A* (that is, A_ij = A_ji*)
• Order matters: operators don't always commute, e.g., [x, p] = (xp - px) = ih/2π. Heisenberg!
• Things are measured using inner products
– Inner product: <ψ|φ> = Σ_j ψ_j* φ_j  (j = 1, …, N)
– Measured energy in state ψ: <H> = <ψ|H|ψ>
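The inner-product rules above are just complex linear algebra. A short NumPy sketch (not from the talk; the dimension and random state are illustrative) checks that for a self-adjoint H the measured energy <ψ|H|ψ> comes out real:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 4
A = rng.normal(size=(N, N)) + 1j * rng.normal(size=(N, N))
H = (A + A.conj().T) / 2                 # self-adjoint: H_ij = H_ji*

psi = rng.normal(size=N) + 1j * rng.normal(size=N)
psi /= np.linalg.norm(psi)               # normalize |psi>

inner = np.vdot(psi, psi)                # <psi|psi> = sum_j psi_j* psi_j
energy = np.vdot(psi, H @ psi)           # <H> = <psi|H|psi>
```

`np.vdot` conjugates its first argument, which is exactly the Σ_j ψ_j* φ_j convention on the slide.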
Schrödinger's Equation
• i ∂_t |ψ(t)> = H |ψ(t)>  (we let ħ = h/2π = 1)
• This means that |ψ(t)> = e^{-iHt} |ψ(0)>
• Since H = H*, the evolution operator U(t) = e^{-iHt} is unitary:
• U* = U^{-1},  U*U = I
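This unitarity claim can be checked numerically. A minimal sketch (not from the talk; it builds e^{-iHt} from the eigendecomposition of a small random Hermitian H rather than a matrix-exponential library):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
H = (A + A.conj().T) / 2                  # Hermitian Hamiltonian

# U(t) = exp(-i H t), built from H = V diag(lam) V*
lam, V = np.linalg.eigh(H)
t = 1.7
U = V @ np.diag(np.exp(-1j * lam * t)) @ V.conj().T

# Unitarity: U* U = I, so evolution preserves norms (probabilities)
assert np.allclose(U.conj().T @ U, np.eye(3))
```

Norm preservation is the physical content: probability is conserved under Schrödinger evolution.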
QUBITS
|0> = (1, 0)^T is a state with its qubit = 0
|1> = (0, 1)^T is a state with its qubit = 1
For normal bits this is the whole story. Not so for qubits!
QUBITS
|ψ> = (α, β)^T is a state with its qubit in a superposition of an empty and a full bit:
|ψ> = α|0> + β|1>
This seems weird; but it is true and is the source of (nearly) all of quantum computing's promise!
Quantum parallelism
• We create a quantum circuit using quantum gates.
• Q-gates are unitary operators that operate on the bit-states.
• We can create a starting state for all the bits:
• |Ψ0> = |ψ0,1; ψ0,2; ψ0,3; … ψ0,j; … ψ0,N>
• Where |ψ0,j> (= α0,j|0> + β0,j|1>) is the initial state of the jth qubit
Quantum parallelism
To do a calculation we move the N-bit state Ψ0 through the successive set of L Q-gates:
|Ψf> = U_L U_{L-1} … U_k … U_1 |Ψ0>
|Ψf> = |ψf,1; ψf,2; ψf,3; … ψf,j; … ψf,N>
And |ψf,j> = αf,j|0> + βf,j|1>
So, we evolve 2^N bit patterns at once! Unfortunately all that parallelism collapses at output!
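The 2^N-pattern claim is easy to see in a simulator. A minimal sketch (not from the talk; one Hadamard layer on N = 4 qubits is an illustrative choice) builds the product state by Kronecker products:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
zero = np.array([1.0, 0.0])                    # |0>

N = 4
psi = np.array([1.0])
for _ in range(N):                             # start in |0...0>, apply H to each qubit
    psi = np.kron(psi, H @ zero)

# A single gate layer has put all 2^N bit patterns into superposition,
# each with amplitude 1/sqrt(2^N).
assert psi.shape == (2**N,)
assert np.allclose(psi, 1 / np.sqrt(2**N))
```

Measuring this state returns just one of the 2^N patterns, which is exactly the "collapses at output" caveat on the slide.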
Pauli Operators and 1-bit Q-gates
H = (σx + σz)/√2 = (1/√2) [[1, 1], [1, -1]]  (Hadamard gate)
S = [[1, 0], [0, i]]  (phase gate)
I = [[1, 0], [0, 1]]
σz = [[1, 0], [0, -1]]
σx = [[0, 1], [1, 0]]
σy = [[0, -i], [i, 0]]
T = [[1, 0], [0, √i]]  (π/8 gate)
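These gates can all be written down and sanity-checked in a few lines. A NumPy sketch (not from the talk) verifies that each gate is unitary and that T² = S and S² = σz, which is why T is "half" a phase gate:

```python
import numpy as np

I  = np.eye(2)
sx = np.array([[0, 1], [1, 0]])
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]])
H  = (sx + sz) / np.sqrt(2)                 # Hadamard gate
S  = np.diag([1, 1j])                       # phase gate
T  = np.diag([1, np.exp(1j * np.pi / 4)])   # pi/8 gate, sqrt(i) = e^{i pi/4}

# Every 1-bit Q-gate is unitary: U* U = I
for U in (I, sx, sy, sz, H, S, T):
    assert np.allclose(U.conj().T @ U, np.eye(2))

# T^2 = S and S^2 = sigma_z
assert np.allclose(T @ T, S)
assert np.allclose(S @ S, sz)
```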
2-bit Q-gates: C-NOT
|A> → |A>
|B> → |B⊕A>
2-bit Q-gates: C-NOT
U_C-NOT = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1], [0, 0, 1, 0]]
U_C-NOT is unitary – check it out!
2-bit Q-gates: C-NOT examples
U_C-NOT |00> = |00>,  U_C-NOT |01> = |01>
U_C-NOT |10> = |11>,  U_C-NOT |11> = |10>
2-bit Q-gates: C-NOT general case
U_C-NOT (αA αB, αA βB, βA αB, βA βB)^T = (αA αB, αA βB, βA βB, βA αB)^T
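The general case follows from linearity. A NumPy sketch (not from the talk; the particular amplitudes are illustrative) applies U_C-NOT to a product state and confirms that only the control = 1 block of target amplitudes swaps:

```python
import numpy as np

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

aA, bA = 0.6, 0.8                          # control |A> = aA|0> + bA|1>
aB, bB = 1 / np.sqrt(2), 1j / np.sqrt(2)   # target  |B> = aB|0> + bB|1>

ab = np.kron([aA, bA], [aB, bB])           # (aA aB, aA bB, bA aB, bA bB)^T
out = CNOT @ ab

# Target amplitudes flip only where the control amplitude is bA:
expected = np.array([aA * aB, aA * bB, bA * bB, bA * aB])
assert np.allclose(out, expected)
```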
C-NOT is important
Amazingly, C-NOT together with the single-bit Q-gates (e.g., H, S, T and the Paulis) is all you need to create any multi-Q-bit gate. They form a universal set of building blocks for quantum circuitry.
Quantum Noise & Q-ECC
• A real limitation of quantum computing is noise, which eventually breaks the coherence of quantum superpositions of states.
• Fortunately, quantum error-correcting codes have been invented which, if noise is below a threshold, guarantee coherence.
• Nonetheless, this is not a trivial issue at all!
Is quantum really better?
Sometimes – in principle! For example, thanks to Bob Griffiths' product form for the quantum Fourier transform, quantum algorithms can do FTs exponentially faster than the FFT.
Is quantum really better?
Unfortunately, it is seemingly impossible to input an arbitrary state to the QFT. It is also impossible in principle to output the QFT. Nonetheless, the QFT enables us to do quantum phase estimation, which allows us to do quantum ordering and factoring – exponentially faster than number-theoretic sieves!
Real quantum computers?
• To make quantum computing a reality, we need to be able to
– Build lots of robust bits
– Efficiently perform a large set of unitary transformations
– Prepare initial states reliably
– Measure outputs
None of these are currently in good shape for any candidate technology.
Takeaway message
• Due to wavefunction collapse upon measuring output, we only get N-way instead of 2^N-way parallelism in the output.
• We cannot query simulations in midstream.
• Only in a few cases have quantum algorithms been shown to have huge gains in time to solution.
• Quantum seems destined for now to be best for questions that can be answered with minimal output.
Adiabatic Quantum Relaxation
• Create a collection of qubits. Prepare them in the lowest eigenstate of the Hamiltonian of a simple quantum system.
• Slowly transition the system by operating on the qubits with an increasing "perturbation" that effectively removes the original simple Hamiltonian and transitions it to a complex Hamiltonian that represents a "hard" problem.
• Output the final states.
Topical Example – Ising Model in a transverse magnetic field
H(t) = {1 - τ(t)} H0 + τ(t) H∞
H0 = -Σ_i h_i σ^x_i
H∞ = -Σ_{i,j} J_ij σ^z_i σ^z_j
[σ^z_j, σ^x_i] = 2i σ^y_j δ_{j,i}  (σ × σ = 2iσ)!
τ(0) = 0, τ(∞) = 1, τ is smoothly increasing
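For a handful of spins the interpolated Hamiltonian can be diagonalized exactly along the sweep. A NumPy sketch (not from the talk; N = 4 qubits, unit transverse fields, and random couplings are illustrative assumptions) tracks the gap between the ground and first excited state, the quantity that controls how slowly the adiabatic sweep must go:

```python
import numpy as np

sx = np.array([[0.0, 1.0], [1.0, 0.0]])
sz = np.diag([1.0, -1.0])

def op(single, i, N):
    """Single-site operator `single` acting on qubit i of an N-qubit register."""
    m = np.array([[1.0]])
    for j in range(N):
        m = np.kron(m, single if j == i else np.eye(2))
    return m

N = 4
rng = np.random.default_rng(3)
h = np.ones(N)                               # transverse fields
J = np.triu(rng.normal(size=(N, N)), 1)      # random couplings, i < j

H0 = -sum(h[i] * op(sx, i, N) for i in range(N))
Hinf = -sum(J[i, j] * op(sz, i, N) @ op(sz, j, N)
            for i in range(N) for j in range(i + 1, N))

# Gap between ground and first excited state along the sweep tau: 0 -> 1
gaps = []
for tau in np.linspace(0.0, 1.0, 21):
    w = np.linalg.eigvalsh((1 - tau) * H0 + tau * Hinf)
    gaps.append(w[1] - w[0])
# Adiabatic evolution must be slow relative to the minimum of these gaps.
```

Where the gap becomes small, the "slow enough" condition on the next slides becomes severe.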
Ising QAR
• Map the Ising model onto a feasible graph (network), G, with connections representing {Jij}.
• {Jij} and G are chosen to map the Ising problem H∞ onto the NP-hard problem to be attacked.
• Evolve the problem on the QAR.
Issues
Adiabatic behavior: this looks like a continuation method for the ground state of a time-dependent Hamiltonian. The starting and ending Hamiltonians do not commute, so they do not share eigenvectors. Ehrenfest's Theorem guarantees success of the adiabatic approximation (continuation) if evolution is slow enough and if symmetry issues and level crossings do not violate assumptions.
Issues
I am not aware of research that shows definitively when continuation success conditions are met in QARs. Decoherence due to quantum noise is (in my ken) still poorly understood in QARs. QARs are more limited in scope than circuit quantum computers. How limited is unknown.
Neuro-mimetic computing
[Figure: part of a neural cortex from a rat brain]
BRAIN CELL
Rather more complex than integrating sigmoids, hyperbolic tangents, or rectified linear functions.
Nonetheless, it works!
A little history of NN milestones
MACHINE LEARNING / ARTIFICIAL INTELLIGENCE
• In 1943, McCulloch and Pitts (M-P) introduced the artificial neuron and pointed to pattern classification as front and center to a theory of intelligence!
• In 1949, Hebb proposed that learning changes the morphology of intelligent networks. "The rich get richer and the poor poorer." (training)
• In 1958, Rosenblatt introduced the perceptron: an M-P network with trainable synapses.
• In 1969, Minsky and Papert pointed out the severe failings of the original perceptron and showed what needed to be done to make it a Turing machine.
MACHINE LEARNING / ARTIFICIAL INTELLIGENCE
• In 1982, Hopfield introduced the Hopfield net – the basis for many of our modern advances, though not powerful by today's standards.
• In 1982, Hinton et al. introduced Restricted Boltzmann Machines.
• In 1984, Fukushima's Neocognitron overcame many of the problems of scale, translation and rotation.
• In 1985, Amit pointed out that the Hopfield net was in many ways identical to the Ising spin glass in its mean-field approximation.
MACHINE LEARNING / ARTIFICIAL INTELLIGENCE
• In 1986, Hinton et al. introduced backpropagation as a learning method.
• In 1987, Wolynes discovered that protein folding is also strongly analogous to the behavior of Ising spin glasses – and discovered minimal frustration and folding funnels.
• In 1987, Peterson and Anderson showed how to speed up SBMs and RBMs by using the Ising mean-field approximation.
• In 2004, LeCun and Bottou emphasized the role of stochastic gradient descent in deep networks.
MACHINE LEARNING / ARTIFICIAL INTELLIGENCE
• In 2006, Hinton et al. introduced deep belief networks.
• In 2006, Bengio et al. analyzed deep auto-encoders: deep networks with greedy layer-by-layer training.
• In 2012, Hinton et al. analyzed and emphasized the role of hierarchical abstraction in Deep Neural Networks.
MACHINE LEARNING / AI
• In 2014, Mehta and Schwab showed an exact mapping between Kadanoff's variational approximation for the Renormalization Group approach to the Ising Model and deep neural nets!
• In 2015, LeCun et al. demonstrated the relationship of deep NNs to the Ising spin glass in the spherical approximation – insights galore!
From: Deep Learning: Methods and Applications, Li Deng and Dong Yu
[Figure: start of use of DNNs]
Simplified neuron
[Figure: simplified neuron. Inputs x0, x1, x2 arrive via input axons and synapses onto dendrites with weights w0, w1, w2; the cell body computes Σ_j w_j x_j + b; the activation function f is applied and the output axon carries f(Σ_j w_j x_j + b).]
Activation Functions
Sigmoid: f(x) = 1/(1 + e^{-x})  (most like biology)
Hyperbolic tangent: f(x) = [e^x - e^{-x}]/[e^x + e^{-x}]  (most useful for Ising Model comparisons)
ReLU: f(x) = Θ(x)·x, where Θ(x) is the unit step function  (currently most favored)
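The three activation functions are one-liners in NumPy. A minimal sketch (not from the talk) implements them and checks the identity tanh(x) = 2·sigmoid(2x) - 1, which is why tanh is just a sigmoid rescaled to the Ising-friendly range (-1, +1):

```python
import numpy as np

def sigmoid(x):                      # most like biology
    return 1.0 / (1.0 + np.exp(-x))

def tanh_act(x):                     # natural for Ising (+/-1) comparisons
    return np.tanh(x)

def relu(x):                         # Theta(x) * x, currently most favored
    return np.maximum(0.0, x)

x = np.linspace(-3, 3, 7)
# tanh is a rescaled sigmoid: tanh(x) = 2*sigmoid(2x) - 1
assert np.allclose(tanh_act(x), 2 * sigmoid(2 * x) - 1)
```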
Prototypical Neural Networks
[Figure: input layer, hidden layers, output layer – two layers of a network topology for a Stochastic Boltzmann Machine]
(A Restricted Boltzmann Machine looks like this without the (red) intra-layer connections)
Loss function for an RBM
H = -Σ_{v,h} J_{v,h} t^z_v t^z_h - Σ_{i=v,h} b_i t^z_i
v is summed over the visible units; h is summed over the hidden units
t^z_i = 0, 1  (= (σ^z_i + 1)/2)
This is the Ising model on a bipartite graph!
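The bipartite structure is visible directly in code: couplings J only connect visible to hidden units. A NumPy sketch (not from the talk; layer sizes and random parameters are illustrative assumptions) evaluates the RBM energy for one {0,1} configuration:

```python
import numpy as np

rng = np.random.default_rng(4)
nv, nh = 5, 3                          # visible and hidden layer sizes
J = rng.normal(size=(nv, nh))          # couplings only ACROSS the two layers
b = rng.normal(size=nv + nh)           # biases: visible units first, then hidden

v = rng.integers(0, 2, size=nv)        # t_v in {0, 1}
h = rng.integers(0, 2, size=nh)        # t_h in {0, 1}

# H = -sum_{v,h} J_vh t_v t_h - sum_i b_i t_i  (Ising on a bipartite graph)
E = -v @ J @ h - b @ np.concatenate([v, h])
```

Because J has shape (nv, nh) there is simply nowhere to store an intra-layer coupling, which is the "restricted" in Restricted Boltzmann Machine.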
What do we (typically) do with NNs?
• A neural network is presented with training data.
• Its job is to learn that data, then extrapolate from it to correctly recognize and classify new data.
• Trivial example: show the NN many pictures of cats and then ask it to classify other pictures by whether they contain cats, and perhaps how many cats they contain, and perhaps their breeds, and perhaps their maturity, ….
How NNs do it is fascinating!
• 1. Preparation of data
• 2. Initialization of the net
• 3. Pre-training, if used
• 4. Forward propagation: learning by activating neurons
• 5. Back-propagation: changing the weights in the activation functions to minimize the "Loss" function (the error in learning)
• 6. Repeat
• 7. Output optimal classifications
• Typically there are two visible layers (in, out) and one or more "hidden" layers.
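The forward/backward loop above can be sketched end-to-end on a toy problem. This is a minimal NumPy illustration, not the talk's method: the XOR task, one 4-unit hidden layer, sigmoid activations, squared-error loss, and the learning rate are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
# 1. Data: the XOR truth table
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# 2. Initialize a net with one hidden layer (in -> hidden -> out)
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)
sig = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 1.0

losses = []
for _ in range(2000):
    # 4. Forward propagation
    h = sig(X @ W1 + b1)
    out = sig(h @ W2 + b2)
    losses.append(np.mean((out - y) ** 2))   # the "Loss" (error in learning)
    # 5. Back-propagation: gradient of the loss through both layers
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)
    # 6. Repeat
```

After the loop, thresholding `out` at 0.5 gives the learned classifications (step 7).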
Deep Neural Networks (DNNs)
• DNNs are networks with many hidden layers (typically 10–20 layers).
• They are huge (often 100 million plus network parameters).
• Their input layer is often a Deep Belief Network (DBN): a stack of Restricted Boltzmann Machines used in learning.
Deep Learning
• In Deep Learning, learning and training are done layer-by-layer from input through many hidden layers to output.
• As we go up the stack of layers, learning/training becomes more abstract and powerful.
• This was a real surprise!
• Why it happens is not fully understood.
Deep Learning
• Recently, evidence is mounting that the answer lies in two pieces of physics:
– the Ising Model and its spin glasses
– the resolution of their deep physics via Renormalization Group Theory (RGT).
• This is also happening in attempts to develop successful folding methods for proteins.
The rest of the story
• I have glossed over nearly 75 years of R&D!
• I haven't told you how RGT works.
• I haven't told you how several of us (myself and, independently, Ken Wilson, and a couple of years later Masuo Suzuki) discovered that the Ising Model in d dimensions is 1-1 and onto a (scalar) quantum field theory in d-1 dimensions (50 years ago!).
• I haven't told you how the Ising model can be used in showing that in an infinite universe there are an infinite number of Bill Camps telling this story to an infinite number of identical audiences.
• I haven't discussed Landauer, or black holes, the holographic theory of memory, information and Maldacena universes.
God willing and the Creek don't rise, there's always next year!