
Page 1: Discrepancy and SDPs

Discrepancy and SDPs

Nikhil Bansal (TU Eindhoven)

Page 2: Discrepancy and SDPs

Outline

Discrepancy: definitions and applications
Basic results: upper/lower bounds
Partial Coloring method (non-constructive)

SDPs: basic method
Algorithmic six std. deviations
Lovett-Meka result
Lower bounds via SDP duality (Matousek)

Page 3: Discrepancy and SDPs

Material

Classic: Geometric Discrepancy by J. Matousek

Papers:
• Bansal. Constructive algorithms for discrepancy minimization. FOCS 2010
• Matousek. The determinant lower bound is almost tight. arXiv'11
• Lovett, Meka. Discrepancy minimization by walking on the edges. arXiv'12
• Other related recent works.

Survey (main ideas): Bansal. Semidefinite opt. and disc. theory.

Page 4: Discrepancy and SDPs

Discrepancy: What is it?

Study of gaps in approximating the continuous by the discrete.

Original motivation: numerical integration / sampling. How well can you approximate a region by discrete points?

Discrepancy: max over intervals I of |(# points in I) − (length of I)|

[Figure: n points on the segment [0, n]; an integral is estimated by sampling at these points, and the estimation error is governed by the discrepancy.]

Page 5: Discrepancy and SDPs

Discrepancy: What is it?

Problem: How uniformly can you distribute points in a grid? “Uniform”: for every axis-parallel rectangle R, |(# points in R) − (Area of R)| should be low.

[Figure: an n^{1/2} × n^{1/2} grid.]

Discrepancy: max over rectangles R of |(# points in R) − (Area of R)|

Page 6: Discrepancy and SDPs

Distributing points in a grid

Problem: How uniformly can you distribute points in a grid? “Uniform”: for every axis-parallel rectangle R, |(# points in R) − (Area of R)| should be low.

[Figure: three placements of n = 64 points]
Uniform grid: ≈ n^{1/2} discrepancy
Random: ≈ n^{1/2} (log log n)^{1/2} discrepancy
Van der Corput set: O(log n) discrepancy!
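As a quick illustration (not from the slides), here is a small Python sketch that builds a 2-d Hammersley/Van der Corput point set by bit reversal and compares its rectangle discrepancy with that of random points; it measures only axis-anchored boxes in [0,1]², a simplified proxy for the grid discrepancy above, and all helper names are mine.

```python
import numpy as np

def van_der_corput(i, bits):
    """Bit-reversal of i as a fraction in [0,1): the 1-d Van der Corput sequence."""
    x, f = 0.0, 0.5
    for _ in range(bits):
        x += f * (i & 1)
        i >>= 1
        f /= 2
    return x

def rect_discrepancy(pts):
    """Max over anchored boxes [0,x) x [0,y) of |#points - n * area| (sampled corners)."""
    n = len(pts)
    worst = 0.0
    for x in np.linspace(0, 1, 33):
        for y in np.linspace(0, 1, 33):
            count = np.sum((pts[:, 0] < x) & (pts[:, 1] < y))
            worst = max(worst, abs(count - n * x * y))
    return worst

n, bits = 64, 6
hammersley = np.array([(i / n, van_der_corput(i, bits)) for i in range(n)])
random_pts = np.random.rand(n, 2)
print("Van der Corput / Hammersley:", rect_discrepancy(hammersley))
print("random points              :", rect_discrepancy(random_pts))
```

On typical runs the structured point set comes out noticeably more balanced than the random one, in line with the bounds quoted above.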

Page 7: Discrepancy and SDPs

Quasi-Monte Carlo Methods

With N random samples: error ≈ 1/N^{1/2}. Quasi-Monte Carlo methods: error ≈ discrepancy/N, so low-discrepancy point sets give a much smaller error.

Can discrepancy be O(1) for the 2-d grid? No: it is Ω(log n) [Schmidt'77].

d dimensions: O(log^{d−1} n) [Halton-Hammersley'60]; lower bound Ω(log^{(d−1)/2} n) [Roth'64], improved to Ω(log^{(d−1)/2 + δ} n) [Bilyk, Lacey, Vagharshakyan'08].

Page 8: Discrepancy and SDPs

Discrepancy: Example 2

Input: n points placed arbitrarily in a grid. Color them red/blue such that each rectangle is colored as evenly as possible.

Discrepancy: max over rect. R ( | # red in R - # blue in R | )

Continuous: Color each element 1/2 red and 1/2 blue (0 discrepancy)

Discrete: a random coloring gives about O(n^{1/2} log^{1/2} n); one can achieve O(log^{2.5} n).

Page 9: Discrepancy and SDPs

Discrepancy: Example 2

Input: n points placed arbitrarily in a grid. Color them red/blue such that each rectangle is colored as evenly as possible.

Discrepancy: max over rect. R ( | # red in R - # blue in R | )

Discrete: can achieve O(log^{2.5} n).

Exercise: O(log^4 n). Optional: O(log^{2.5} n).

Why do we care?

Page 10: Discrepancy and SDPs

Combinatorial Discrepancy

Universe: U = [1,…,n]. Subsets: S1, S2, …, Sm.

Color elements red/blue so each set is colored as evenly as possible.

Find χ: [n] → {−1,+1} to minimize disc(χ) = max_S |χ(S)|, where χ(S) = Σ_{i ∈ S} χ(i).

If A is the incidence matrix of the set system, disc(A) = min_{x ∈ {−1,+1}^n} ‖Ax‖_∞.

[Figure: a set system S1, S2, S3, S4 over the elements.]
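To make the definition concrete, a tiny brute-force sketch (my own illustration, not part of the talk): disc(A) is the minimum of ‖Ax‖_∞ over all 2^n sign vectors, so it can only be computed this way for very small n.

```python
import numpy as np
from itertools import product

def disc(A):
    """Combinatorial discrepancy of a small 0/1 incidence matrix A (rows = sets):
    disc(A) = min over x in {-1,+1}^n of max_j |(A x)_j|.  Brute force."""
    n = A.shape[1]
    return min(np.abs(A @ np.array(x)).max() for x in product((-1, 1), repeat=n))

# Example: three sets over four elements.
A = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [1, 0, 1, 1]])
print(disc(A))
```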

Page 11: Discrepancy and SDPs

Applications

CS: Computational Geometry, Comb. Optimization, Monte-Carlo simulation, Machine Learning, Complexity, Pseudo-Randomness, …

Math: Dynamical Systems, Combinatorics, Mathematical Finance, Number Theory, Ramsey Theory, Algebra, Measure Theory, …

Page 12: Discrepancy and SDPs

Hereditary Discrepancy

Discrepancy is a useful measure of the complexity of a set system.

Hereditary discrepancy: herdisc(U,S) = max_{U' ⊆ U} disc(U', S|_{U'})

Robust version of discrepancy (usually same as discrepancy)

[Figure: the set system together with a duplicated copy: elements 1, 2, …, n, 1', 2', …, n' and sets S1, S2, S'1, S'2.]

But plain discrepancy is not so robust: small modifications of a set system (such as duplicating elements) can make its discrepancy drop to 0.

Page 13: Discrepancy and SDPs

Rounding

Lovasz-Spencer-Vesztergombi'86: Given any matrix A and any x ∈ R^n, one can round x to an integral x̄ such that ‖A(x − x̄)‖_∞ ≤ herdisc(A).

Proof: Round the bits of x one by one.

Remark: A is TU iff it is an integer matrix with herdisc(A) ≤ 1 (the Ghouila-Houri characterization of TU matrices).
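A sketch of the bit-by-bit rounding behind the LSV proof (my own rendering; `coloring_oracle` is a hypothetical helper that returns a ±1 coloring of the given columns with discrepancy at most herdisc(A), which is exactly what the theorem assumes exists):

```python
import numpy as np

def lsv_round(A, x, bits=20, coloring_oracle=None):
    """LSV-style rounding sketch: strip the binary digits of x one at a time,
    using a low-discrepancy coloring of the elements whose current least
    significant bit is set."""
    x = np.round(np.asarray(x, dtype=float) * 2**bits) / 2**bits
    for k in range(bits, 0, -1):
        scaled = np.round(x * 2**k).astype(np.int64)
        odd = np.flatnonzero(scaled % 2 == 1)        # elements whose k-th bit is set
        if len(odd) == 0:
            continue
        chi = coloring_oracle(A[:, odd])             # hypothetical +-1 coloring of these columns
        x[odd] += chi * 2.0 ** (-k)                  # clears the k-th bit either way
    return x
```

Each pass clears one binary digit and changes Ax by at most 2^{−k} · herdisc(A) in the ∞-norm, so the total rounding error sums geometrically to at most herdisc(A).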

Page 14: Discrepancy and SDPs

Rounding

The LSV'86 result only guarantees existence. How to find such a rounding efficiently? Nothing was known until recently.

Thm [B'10]: Can round efficiently, with error within a polylogarithmic factor of herdisc(A).

Page 15: Discrepancy and SDPs

More rounding approaches

Bin Packing [Eisenbrand, Palvolgyi, Rothvoss'11]: is OPT ≤ LP + O(1)?

Yes, for O(1) distinct item sizes, if the k-permutation conjecture is true. (Recently, Newman-Nikolov'11 disproved the k-permutation conjecture.)

Technique refined further by Rothvoss'12 (the entropy rounding method).

Page 16: Discrepancy and SDPs

Dynamic Data Structures

N points in a 2-d region, with weights updated over time.

Query: Given an axis-parallel rectangle R, determine the total weight on points in R.

Preprocess:1) Low query time2) Low update time (upon weight change)

Page 17: Discrepancy and SDPs

Example

Line: interval queries.
Trivial: Query = O(n), Update = 1
Store all interval sums: Query = 1, Update = O(n^2)
Prefix sums: Query = 2, Update = O(n)
Dyadic intervals: Query = O(log n), Update = O(log n)

Recursively for 2-d.
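One concrete way to realize the Query = O(log n), Update = O(log n) point on a line is a tree over dyadic intervals; below is a minimal Fenwick-tree sketch (my own, not from the slides), which would be applied recursively coordinate-by-coordinate for the 2-d case.

```python
class Fenwick:
    """Binary indexed tree over n weighted points on a line: point update and
    interval query both take O(log n) time."""
    def __init__(self, n):
        self.n = n
        self.t = [0.0] * (n + 1)

    def update(self, i, delta):          # add delta to the weight of point i (1-indexed)
        while i <= self.n:
            self.t[i] += delta
            i += i & (-i)

    def prefix(self, i):                 # total weight of points 1..i
        s = 0.0
        while i > 0:
            s += self.t[i]
            i -= i & (-i)
        return s

    def interval(self, lo, hi):          # total weight of points lo..hi
        return self.prefix(hi) - self.prefix(lo - 1)
```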

Page 18: Discrepancy and SDPs

What about other queries?

Circles, arbitrarily rotated rectangles, axis-aligned triangles, … Turns out that polylogarithmic query and update time is no longer possible.

Reason: the set system of query objects and points has large discrepancy (about n^{1/4}).

Larsen-Green’11

Page 19: Discrepancy and SDPs

Idea

Any data structure for this problem is implicitly maintaining a matrix factorization: a good data structure implicitly computes D = A · P, where
A = row sparse (low query time), P = column sparse (low update time).

[Figure: the query matrix D maps the point weights w to query answers; P precomputes aggregates and A combines a few of them per query.]

Page 20: Discrepancy and SDPs

Outline

Discrepancy: definitions and applications
Basic results: upper/lower bounds
Partial Coloring method (non-constructive)

SDPs: basic method
Algorithmic six std. deviations
Lovett-Meka result
Lower bounds via SDP duality (Matousek)

Page 21: Discrepancy and SDPs

Basic Results

What is the discrepancy of a general system on m sets?

Page 22: Discrepancy and SDPs

Best Known Algorithm

Random: Color each element i independently as x(i) = +1 or -1 with probability ½ each.

Thm: Discrepancy = O((n log m)^{1/2}).

Pf: For each set, expect O(n^{1/2}) discrepancy. Standard tail bounds: Pr[|Σ_{i∈S} x(i)| ≥ c n^{1/2}] ≈ e^{−c²}.

Union bound + choose c ≈ (log m)^{1/2}.

Analysis tight: random actually incurs Ω((n log m)^{1/2}).

Henceforth, focus on m=n case.
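A quick Monte-Carlo sanity check of the random-coloring bound (my own sketch; the random set system and all parameters are arbitrary choices): the observed discrepancy should sit near the (n log m)^{1/2} benchmark.

```python
import numpy as np

def random_coloring_experiment(m=500, n=500, trials=50, seed=1):
    """Compare the discrepancy of uniform random +-1 colorings of a random
    set system against the (n log m)^(1/2) benchmark."""
    rng = np.random.default_rng(seed)
    A = (rng.random((m, n)) < 0.5).astype(int)   # each set contains each element w.p. 1/2
    discs = [np.abs(A @ rng.choice([-1, 1], size=n)).max() for _ in range(trials)]
    print("mean observed discrepancy:", np.mean(discs))
    print("(n log m)^(1/2)          :", (n * np.log(m)) ** 0.5)

random_coloring_experiment()
```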

Page 23: Discrepancy and SDPs

Better Colorings Exist!

[Spencer'85] (Six standard deviations suffice): There always exists a coloring with discrepancy ≤ 6 n^{1/2}.

(In general, for arbitrary m, discrepancy = O(n^{1/2} log^{1/2}(m/n)).)

Tight: for m = n, cannot beat 0.5 n^{1/2} (Hadamard matrix).

Will explore further in the exercises: a spectral lower bound for a matrix A in terms of the least eigenvalue of AᵀA.

Page 24: Discrepancy and SDPs

Better Colorings Exist!

[Spencer'85] (Six standard deviations suffice): There always exists a coloring with discrepancy ≤ 6 n^{1/2}.

(In general, for arbitrary m, discrepancy = O(n^{1/2} log^{1/2}(m/n)).)

Inherently non-constructive proof (pigeonhole principle on exponentially large universe)

Challenge: Can we find it algorithmically? Certain algorithms do not work [Spencer].

Conjecture [Alon-Spencer]: May not be possible.

Page 25: Discrepancy and SDPs

Beck Fiala Thm

U = [1,…,n] Sets: S1,S2,…,Sm

Suppose each element lies in at most t sets (t << n).

[Beck-Fiala'81]: Discrepancy ≤ 2t − 1. (Elegant linear-algebraic argument; algorithmic result.) (Note: random does not work.)

Beck-Fiala Conjecture: O(t^{1/2}) discrepancy is possible.
Other results (non-constructive): O(t^{1/2} log t log n) [Beck]; O(t^{1/2} log n) [Srinivasan]; O(t^{1/2} log^{1/2} n) [Banaszczyk].

[Figure: a set system S1, S2, S3, S4.]
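The Beck-Fiala argument is itself algorithmic, so it can be sketched directly in code (my own rendering of the standard argument, using scipy): keep all variables floating, ignore a set once it has at most t floating elements, and repeatedly move inside the null space of the still-active sets until another variable reaches ±1.

```python
import numpy as np
from scipy.linalg import null_space

def beck_fiala(A, t):
    """Beck-Fiala coloring: A is an (m, n) 0/1 incidence matrix in which every
    element (column) lies in at most t sets.  Returns x in {-1,+1}^n with
    max_j |(A x)_j| <= 2t - 1."""
    m, n = A.shape
    x = np.zeros(n)
    floating = np.ones(n, dtype=bool)
    while floating.any():
        F = np.flatnonzero(floating)
        # A set stays "active" while it has more than t floating elements;
        # counting shows there are always fewer active sets than floating variables.
        active = A[:, F].sum(axis=1) > t
        B = A[np.ix_(np.flatnonzero(active), F)].astype(float)
        # Nonzero direction keeping every active set's sum unchanged.
        d = null_space(B)[:, 0] if B.size else np.eye(len(F))[:, 0]
        # Walk along d until some floating coordinate reaches -1 or +1.
        with np.errstate(divide="ignore"):
            to_hi = np.where(d > 1e-12, (1 - x[F]) / d, np.inf)
            to_lo = np.where(d < -1e-12, (-1 - x[F]) / d, np.inf)
        step = np.minimum(to_hi, to_lo).min()
        x[F] = np.clip(x[F] + step * d, -1, 1)
        floating[F[np.abs(x[F]) > 1 - 1e-9]] = False
    return np.sign(x + 1e-12)   # tie-break exact zeros (none expected)

# Tiny example: 4 sets over 6 elements, every element in at most t = 2 sets.
A = np.array([[1, 1, 1, 0, 0, 0],
              [0, 0, 1, 1, 1, 0],
              [1, 0, 0, 0, 1, 1],
              [0, 1, 0, 1, 0, 1]])
x = beck_fiala(A, t=2)
print(x, np.abs(A @ x).max())   # discrepancy at most 2*2 - 1 = 3
```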

Page 26: Discrepancy and SDPs

Approximating Discrepancy

Question: If a set system has low discrepancy (say << n^{1/2}), can we find a good-discrepancy coloring?

[Charikar, Newman, Nikolov'11]: Even distinguishing discrepancy 0 vs. Ω(n^{1/2}) is NP-hard.

(Matousek): What if the system has low hereditary discrepancy? herdisc(U,S) = max_{U' ⊆ U} disc(U', S|_{U'})

Useful for the rounding application.


Page 27: Discrepancy and SDPs

Two Results

Thm 1: For any set system, can efficiently find a coloring with discrepancy within a polylogarithmic factor of the hereditary discrepancy.

Thm 2: Can get Spencer’s bound constructively. That is, O(n1/2) discrepancy for m=n sets.

Other problems with constructive bounds (matching the current best): the k-permutation problem [Spencer, Srinivasan, Tetali]; geometric problems; the Beck-Fiala setting (Srinivasan's bound); …

Page 28: Discrepancy and SDPs

Relaxations: LPs and SDPs

Not clear how to use.

Linear Program is useless. Can color each element ½ red and ½ blue. Discrepancy of each set = 0!

SDPs (an LP on the inner products v_i · v_j; cannot control the dimension of the v's):
|Σ_{i∈S} v_i|² ≤ n for all S,  |v_i|² = 1.
Intended solution: v_i = (+1,0,…,0) or (−1,0,…,0). But trivially feasible: v_i = e_i (all v_i's orthogonal).

Yet, SDPs will be a major tool.

Page 29: Discrepancy and SDPs

Punch line

SDP very helpful if “tighter” bounds needed for some sets.

|Σ_{i∈S} v_i|² ≤ 2n,   |Σ_{i∈S'} v_i|² ≤ n/log n,   |v_i|² ≤ 1

Not a priori clear why one can do this: the entropy method.

Algorithm will construct coloring over time and use several SDPs in the process.

Tighter bound for S’

Page 30: Discrepancy and SDPs

Partial Coloring Method

Page 31: Discrepancy and SDPs

A Question

[Figure: the range −n to n; the question posed on slides 31-33 is not preserved in the transcript.]

Page 34: Discrepancy and SDPs

An Improvement

Can be improved to …

For a random {−1,1} coloring s, with prob. …

There are {−1,1} colorings s, with …

Page 35: Discrepancy and SDPs

Algorithmically?

Easy: 1/poly(n). Answer: pick any poly(n) colorings.

[Karmarkar-Karp'81]: can achieve about n^{−Θ(log n)}.

Huge gap: Major open question

Remark: {−1,+1} alone is not enough; we really need to allow 0 also. E.g. …
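For context, the Karmarkar-Karp bound comes from the largest-differencing heuristic; here is a minimal sketch of that heuristic (my own illustration of the two-way balancing problem this slide alludes to), which returns the residual |Σ_i s_i a_i| it achieves.

```python
import heapq
import random

def karmarkar_karp(a):
    """Largest-differencing heuristic: repeatedly replace the two largest numbers
    by their difference; the last remaining value is an achievable |sum_i s_i a_i|
    for some signs s_i in {-1,+1}.  For random inputs it is around n^(-Theta(log n)),
    far better than the ~n^(1/2) of a random signing."""
    heap = [-x for x in a]              # max-heap via negated values
    heapq.heapify(heap)
    while len(heap) > 1:
        largest = -heapq.heappop(heap)
        second = -heapq.heappop(heap)
        heapq.heappush(heap, -(largest - second))
    return -heap[0] if heap else 0.0

print(karmarkar_karp([random.random() for _ in range(64)]))
```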

Page 36: Discrepancy and SDPs

Yet another enhancement

There is a {−1,0,1} coloring with at least n/2 entries in {−1,1} s.t. …

Split [−n, n] into buckets; by pigeonhole, at least … of the 2^n sums fall in the same bucket.

Claim: Some two colorings s' and s'' lie in the same bucket and differ in at least n/2 coordinates.

Again consider s = (s' − s'')/2, e.g.
s'  = ( 1,−1, 1, …, 1,−1,−1)
s'' = (−1,−1,−1, …, 1, 1, 1)

Page 37: Discrepancy and SDPs

Proof of Claim

[Kleitman'66] Isoperimetry for the cube: the Hamming ball B(v,r) has the smallest diameter for a given number of vertices.

|B(v, n/4)| < …

i.e., in any sufficiently large set of {−1,1} vectors, some two are at Hamming distance ≥ n/2.

Page 38: Discrepancy and SDPs

Spencer’s proof

Thm: For any set system of n sets on n elements, disc(S) = O(n^{1/2}).

Page 39: Discrepancy and SDPs

Spencer's O(n^{1/2}) result

Partial Coloring Lemma: For any system with m sets, there exists a partial coloring of ≥ n/2 elements with discrepancy O(n^{1/2} log^{1/2}(2m/n)). [For m = n, disc = O(n^{1/2}).]

Algorithm for a total coloring: repeatedly apply the partial coloring lemma (the phases sum geometrically; see the calculation below). Total discrepancy:
O(n^{1/2} log^{1/2} 2) [Phase 1] + O((n/2)^{1/2} log^{1/2} 4) [Phase 2] + O((n/4)^{1/2} log^{1/2} 8) [Phase 3] + … = O(n^{1/2})
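Why the phase bounds sum to O(n^{1/2}): phase k colors about n/2^k remaining elements, and the geometric decay beats the logarithmic growth. A worked version of the sum (my notation):

```latex
\sum_{k \ge 0} C\left(\frac{n}{2^{k}}\right)^{1/2}\log^{1/2}\!\left(2^{\,k+1}\right)
  \;=\; C\,(\log 2)^{1/2}\, n^{1/2}\sum_{k \ge 0}\frac{(k+1)^{1/2}}{2^{k/2}}
  \;=\; O\!\left(n^{1/2}\right),
```

since the series Σ_{k≥0} (k+1)^{1/2} 2^{−k/2} converges to a constant.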

Let us prove the lemma for m = n

Page 40: Discrepancy and SDPs

Proving Partial Coloring Lemma

Call two colorings X1 and X2 “similar” for set S if |X1(S) − X2(S)| ≤ 20 n^{1/2}.

Key Lemma: There exist k = 2^{4n/5} colorings X1, …, Xk such that every two Xi, Xj are similar for every set S1, …, Sn.

Some X1, X2 differ on ≥ n/2 positions. Consider X = (X1 − X2)/2.

Pf: X(S) = (X1(S) − X2(S))/2 ∈ [−10 n^{1/2}, 10 n^{1/2}]

X1 = ( 1,−1, 1, …, 1,−1,−1)
X2 = (−1,−1,−1, …, 1, 1, 1)
X  = ( 1, 0, 1, …, 0,−1,−1)

Page 41: Discrepancy and SDPs

Proving Partial Coloring Lemma

Pf: Associate with coloring X the signature (b1, b2, …, bn), where bi = the bucket in which X(Si) lies. Wish to show: there exist 2^{4n/5} colorings with the same signature.

Note: the number of possible signatures can far exceed the number of colorings (2^n), so naive counting does not work.

Idea: Not all signatures are equally likely. Entropy!

[Figure: buckets of width 20 n^{1/2} with endpoints ±10 n^{1/2}, ±30 n^{1/2}, …, labelled …, −2, −1, 0, 1, 2, ….]

Page 42: Discrepancy and SDPs

Entropy

For a discrete random variable X, H(X) = Σ_x Pr[X = x] log₂(1/Pr[X = x]).

1. Uniform distribution on 2^k points: H(X) = k.
2. If X is distributed on k points, H(X) ≤ log k.
   If H(X) ≤ k, then some outcome has probability ≥ 2^{−k}.
3. Subadditivity: H(X,Y) ≤ H(X) + H(Y).

Page 43: Discrepancy and SDPs

Proving Partial Coloring Lemma

Pf: Associate with coloring X the signature (b1, b2, …, bn), where bi = the bucket in which X(Si) lies. Wish to show: there exist 2^{4n/5} colorings with the same signature.

Choose X uniformly at random: this induces a distribution on signatures. Entropy(signature) ≤ n/5 implies some signature has probability ≥ 2^{−n/5}.

Entropy(signature) ≤ Σ_i Entropy(bi)   [subadditivity of entropy]

bi = 0 w.p. ≈ 1 − 2e^{−50},  = ±1 w.p. ≈ e^{−50},  = ±2 w.p. ≈ e^{−450}, …

Ent(bi) ≤ 1/5
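A quick numeric check of the Ent(b_i) ≤ 1/5 claim (my own sketch, using the Gaussian approximation X(S) ≈ N(0, n) and scipy): cut the line into buckets of width 20 n^{1/2} and compute the entropy of the bucket index.

```python
import numpy as np
from scipy.stats import norm

def bucket_entropy(num_buckets=40):
    """Entropy (in bits) of the bucket index of X(S), where X(S)/sqrt(n) ~ N(0,1)
    and bucket b covers ((2b-1)*10*sqrt(n), (2b+1)*10*sqrt(n))."""
    probs = [1 - 2 * norm.sf(10)]                    # central bucket b = 0
    for b in range(1, num_buckets):                  # buckets +-b, via tail probabilities
        probs.extend(2 * [norm.sf((2 * b - 1) * 10) - norm.sf((2 * b + 1) * 10)])
    probs = np.array([p for p in probs if p > 0])
    return float(-(probs * np.log2(probs)).sum())

print(bucket_entropy())   # tiny, far below the 1/5 budget per set
```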

Page 44: Discrepancy and SDPs

A useful generalization

Partial coloring with a non-uniform discrepancy bound Δ_S for each set S.

[Figure: for set S, buckets with endpoints ±Δ_S, ±3Δ_S, ±5Δ_S, …, labelled …, −2, −1, 0, 1, 2, ….]

For each set S, consider this “bucketing”.

Suffices to have Σ_S Ent(b_S) ≤ n/5.

Or, if Δ_S = λ_S |S|^{1/2}, then it suffices that Σ_S g(λ_S) ≤ n/5, where
g(λ) ≈ e^{−λ²/2} for λ > 1,  g(λ) ≈ ln(1/λ) for λ < 1.

A bucket of size n^{1/2}/100 has penalty ≈ ln(100).

Page 45: Discrepancy and SDPs

General Partial Coloring

Thm: There is a partial coloring with discrepancy Δ_S for set S, provided Σ_S g(Δ_S/|S|^{1/2}) ≤ n/5, where
g(λ) ≈ e^{−λ²/2} for λ > 1,  g(λ) ≈ ln(1/λ) for λ < 1.

Very flexible: in Spencer's setting, one can even require some sets to have discrepancy 0. Exercise: prove a low-discrepancy partial coloring for Beck-Fiala.

Page 46: Discrepancy and SDPs

Recap

Partial coloring: Δ_S ≈ 10 n^{1/2} gives low entropy (≤ n/5) ⇒ 2^{4n/5} colorings exist with the same signature ⇒ some X1, X2 at large Hamming distance; (X1 − X2)/2 gives the desired partial coloring.

Trouble: 2^{4n/5}/2^n is an exponentially small fraction.

If only we could find the partial coloring efficiently…

Page 47: Discrepancy and SDPs

PART 2

Page 48: Discrepancy and SDPs

Outline

Discrepancy: definitions and applications
Basic results: upper/lower bounds
Partial Coloring method (non-constructive)

SDPs: basic method
Algorithmic six std. deviations
Lovett-Meka result
Lower bounds via SDP duality (Matousek)

Page 49: Discrepancy and SDPs

Algorithms

Thm (B'10): For any matrix A, there is a polynomial-time algorithm to find a {−1,+1} coloring x with ‖Ax‖_∞ = O(λ · polylog(mn)), where λ = herdisc(A).

Corollary: Rounding with error O(λ · polylog(mn)).

Page 50: Discrepancy and SDPs

Algorithm (at high level)

Cube: {−1,+1}^n. Each dimension: an element; each vertex: a coloring.

[Figure: the walk starts at the center of the cube and finishes at a vertex.]

Algorithm: a “sticky” random walk. Each step is generated by rounding a suitable SDP; the moves in the various dimensions are correlated, e.g. γ₁ᵗ + γ₂ᵗ ≈ 0.

Analysis: few steps to reach a vertex (the walk has high variance); disc(Si) does a random walk (with low variance).

Page 51: Discrepancy and SDPs

An SDP

Hereditary discrepancy ≤ λ ⇒ the following SDP is feasible:

Low discrepancy: |Σ_{i∈S_j} v_i|² ≤ λ² for every set S_j
|v_i|² = 1

Solving it, obtain vectors v_i ∈ R^n.

Rounding: Pick a random Gaussian g = (g1, g2, …, gn), each coordinate gi iid N(0,1).

For each i, consider γ_i = g · v_i.
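A sketch of one SDP-plus-rounding step in Python (my own illustration, assuming cvxpy with the SCS solver; feasibility is guaranteed by the existential argument whenever λ is at least the hereditary discrepancy, and the real algorithm re-solves such an SDP at every step on the not-yet-fixed variables).

```python
import numpy as np
import cvxpy as cp

def sdp_rounding_step(A, lam, seed=0):
    """Solve the vector-coloring SDP  |sum_{i in S_j} v_i|^2 <= lam^2, |v_i|^2 = 1
    (as a PSD Gram matrix X with X[i,i] = 1), then round it with a random Gaussian
    g to get correlated increments gamma_i = <g, v_i>."""
    m, n = A.shape
    X = cp.Variable((n, n), PSD=True)                       # X[i,k] = <v_i, v_k>
    constraints = [cp.diag(X) == 1]
    for j in range(m):
        s = A[j].astype(float)
        constraints.append(cp.sum(cp.multiply(np.outer(s, s), X)) <= lam ** 2)
    cp.Problem(cp.Minimize(0), constraints).solve(solver=cp.SCS)

    # Recover vectors v_i (rows of V) from the Gram matrix, then round.
    w, U = np.linalg.eigh(X.value)
    V = U * np.sqrt(np.clip(w, 0, None))
    g = np.random.default_rng(seed).standard_normal(n)
    return V @ g     # gamma_i ~ N(0,1); the sum over any set S_j has std at most ~lam
```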

Page 52: Discrepancy and SDPs

Properties of Gaussians

Lemma: If g ∈ R^n is a random Gaussian, then for any v ∈ R^n, g · v is distributed as N(0, |v|²).

Pf: N(0,a²) + N(0,b²) = N(0, a²+b²), so g · v = Σ_i v(i) g_i ~ N(0, Σ_i v(i)²).

Lemma: Gaussian is rotationally invariant.

Page 53: Discrepancy and SDPs

Properties of Rounding

Lemma: If g ∈ R^n is a random Gaussian, then for any v ∈ R^n, g · v is distributed as N(0, |v|²).
Pf: N(0,a²) + N(0,b²) = N(0, a²+b²), so g · v = Σ_i v(i) g_i ~ N(0, Σ_i v(i)²).

Recall the SDP: |v_i|² = 1, |Σ_{i∈S} v_i|² ≤ λ², and γ_i = g · v_i. Then:
1. Each γ_i ~ N(0,1).
2. For each set S, Σ_{i∈S} γ_i = g · (Σ_{i∈S} v_i) ~ N(0, ≤ λ²) (std deviation ≤ λ).

The γ's mimic a low-discrepancy coloring (but they are not {−1,+1}).

Page 54: Discrepancy and SDPs

Algorithm Overview

Construct the coloring iteratively. Initially: start with the coloring x_0 = (0,0,0,…,0) at t = 0.
At time t: update the coloring as x_t = x_{t−1} + ε(γ_1^t, …, γ_n^t)   (ε tiny: 1/n suffices)

So x_t(i) = ε(γ_i^1 + γ_i^2 + … + γ_i^t).

Color of element i: does a random walk over time with steps ≈ ε·N(0,1). It is fixed once it reaches −1 or +1.

[Figure: x_t(i) performs a random walk between −1 and +1 over time.]

Set S: x_t(S) = Σ_{i∈S} x_t(i) does a random walk with steps ≈ ε·N(0, ≤ λ²).

Page 55: Discrepancy and SDPs

Analysis

Consider time T = O(1/ε²).

Claim 1: With probability ½, an element reaches −1 or +1.
Pf: Each element does a random walk with step size ≈ ε. Recall: a random walk with step size 1 is ≈ O(t^{1/2}) away after t steps.

Claim 2: Each set has O(λ) discrepancy in expectation.
Pf: For each S, x_t(S) does a random walk with step size ≈ ελ.

At time T = O((log n)/ε²):
1. Prob. that an element is still floating < 1/(10n).
2. Expected discrepancy of a set = … (by Chernoff, all sets have discrepancy O(…)).

Page 56: Discrepancy and SDPs

Recap

At each step of the walk, formulate an SDP on the unfixed variables. Use some (existential) property to argue the SDP is feasible. Rounding the SDP solution → one step of the walk.

Properties of the walk: high variance → quick convergence; low variance for the discrepancy of each set → low discrepancy.

Page 57: Discrepancy and SDPs

Refinements

Spencer's six standard deviations result. Goal: obtain O(n^{1/2}) discrepancy for any set system on m = O(n) sets.

A random coloring has n^{1/2} (log n)^{1/2} discrepancy.

The previous approach seems useless: the expected discrepancy of a set is O(n^{1/2}), but some of the random walks will deviate by up to a (log n)^{1/2} factor.

Need an additional idea to prevent this.

Page 58: Discrepancy and SDPs

Spencer's O(n^{1/2}) result

Partial Coloring Lemma: For any system with m sets, there exists a partial coloring of ≥ n/2 elements with discrepancy O(n^{1/2} log^{1/2}(2m/n)). [For m = n, disc = O(n^{1/2}).]

Algorithm for a total coloring: repeatedly apply the partial coloring lemma. Total discrepancy:
O(n^{1/2} log^{1/2} 2) [Phase 1] + O((n/2)^{1/2} log^{1/2} 4) [Phase 2] + O((n/4)^{1/2} log^{1/2} 8) [Phase 3] + … = O(n^{1/2})

Page 59: Discrepancy and SDPs

Algorithm (at high level)

Cube: {−1,+1}^n. Each dimension: an element; each vertex: a coloring.

[Figure: the walk starts at the center of the cube and finishes at a vertex.]

Algorithm: a “sticky” random walk. Each step is generated by rounding a suitable SDP; the moves in the various dimensions are correlated, e.g. γ₁ᵗ + γ₂ᵗ ≈ 0.

Analysis: few steps to reach a vertex (the walk has high variance); disc(Si) does a random walk (with low variance).

Page 60: Discrepancy and SDPs

An SDP

Suppose there exists a partial coloring X:
1. on ≥ n/2 elements,
2. each set S has |X(S)| ≤ Δ_S.

SDP:
Low discrepancy: |Σ_{i∈S_j} v_i|² ≤ Δ_{S_j}²
Many colors: Σ_i |v_i|² ≥ n/2,  |v_i|² ≤ 1

Obtain vectors v_i ∈ R^n. Pick a random Gaussian g = (g1, g2, …, gn), each coordinate gi iid N(0,1). For each i, consider γ_i = g · v_i.

Page 61: Discrepancy and SDPs

Properties of Rounding

Lemma: If g ∈ R^n is a random Gaussian, then for any v ∈ R^n, g · v is distributed as N(0, |v|²).

Recall the SDP: |v_i|² ≤ 1, Σ_i |v_i|² ≥ n/2, |Σ_{i∈S} v_i|² ≤ Δ_S², and γ_i = g · v_i. Then:
1. Each γ_i ~ N(0, σ_i²) where σ_i² ≤ 1.
2. The total variance is large: Σ_i σ_i² ≥ n/2.
3. For each set S, Σ_{i∈S} γ_i ~ N(0, ≤ Δ_S²) (std deviation ≤ Δ_S).

The γ's are sort of like a partial coloring, but not quite!

Page 62: Discrepancy and SDPs

Algorithm Overview

Construct the coloring iteratively. Initially: start with the coloring x_0 = (0,0,0,…,0) at t = 0.

At time t: update the coloring as x_t = x_{t−1} + ε(γ_1^t, …, γ_n^t)   (ε tiny: say 1/n).

So x_t(i) = ε(γ_i^1 + γ_i^2 + … + γ_i^t).

The color of i does a random walk over time (a martingale) with step size ≈ εσ_i. The color of i is fixed once it reaches −1 or +1.

[Figure: x_t(i) performs a random walk between −1 and +1 over time.]

x_t(S) = Σ_{i∈S} x_t(i) also does a random walk over time (step ≈ εΔ_S).

Page 63: Discrepancy and SDPs

Algorithm idea

Fact: A random walk with step size 1 is ≈ O(t^{1/2}) away after t steps.

Consider time t = O(1/ε²).

Claim 1: High total variance, Σ_i σ_i² ≥ n/2 ⇒ Ω(n) variables reach −1 or +1.
Pf: Each element does a random walk with step size ≈ εσ_i. The walks are correlated, so an aggregate bound is needed: use “energy increment” + Markov.

Claim 2: Low std deviation ≤ Δ_S for S ⇒ each set has about Δ_S discrepancy in expectation.
Pf: For each S, x_t(S) does a random walk with step size ≈ εΔ_S.

Same as partial coloring!! Back to square one?

Page 64: Discrepancy and SDPs

Proving Progress

Trouble: each coordinate is a martingale with correlated steps (the γ's), and …

Can show: Pr[at least half of these walks hit ±1 within O(1/ε²) steps] is at least ½.

Know: a ±1 random walk hits ±n^{1/2} within O(n) steps with constant probability. What is the simplest proof of this? Energy: consider E[X_t²]. (Exercise: formalize this.)

Page 65: Discrepancy and SDPs

Algorithm idea

Fact: A random walk with step size 1 is ≈ O(t^{1/2}) away after t steps.

Consider time t = O(1/ε²).

Claim 1: High total variance, Σ_i σ_i² ≥ n/2 ⇒ Ω(n) variables reach −1 or +1.
Pf: Each element does a random walk with step size ≈ εσ_i. The walks are correlated, so an aggregate bound is needed: use “energy increment” + Markov.

Claim 2: Low std deviation ≤ Δ_S for S ⇒ each set has about Δ_S discrepancy in expectation.
Pf: For each S, x_t(S) does a random walk with step size ≈ εΔ_S.

Page 66: Discrepancy and SDPs

Second Idea

Use the flexibility in choosing Δ_S in the entropy condition. Entropy condition: Σ_S g(Δ_S/n^{1/2}) ≤ n/5.

Initially Δ_S ≈ n^{1/2}.

If some set S starts to get discrepancy way more than n^{1/2}, we can reduce Δ_S to λ n^{1/2} for λ < 1.

Key point: we barely incur any penalty. For λ < 1, g(λ) ≈ ln(1/λ).

Can set Δ_S = 1 for an O(1/log n) fraction of the sets!

Page 67: Discrepancy and SDPs

Proof of the O((n log log log n)^{1/2}) bound

Each element's color does a random walk with step size about ε. Run for O(1/ε²) steps.

Initially: set Δ_S = 5 n^{1/2}. Each set's discrepancy does a random walk with step size ≈ 5ε n^{1/2}.

Call a set S dangerous if its discrepancy exceeds 10 n^{1/2} (log log log n)^{1/2}. For a dangerous set S, set Δ_S = n^{1/2}/log n (it is now very unlikely to pick up another n^{1/2} of discrepancy in the remaining O(1/ε²) steps).

Chance a set is ever dangerous ≤ exp(−2 log log log n) < 1/(log log n)².

Entropy penalty for a dangerous set: g(1/log n) ≈ log log n. So the total entropy increase ≤ (log log n) · (n/(log log n)²) << n.

Page 68: Discrepancy and SDPs

Algorithm

Initially write the SDP with Δ_S = c n^{1/2}.

Each set S does a random walk and expects to reach discrepancy O(Δ_S) = O(n^{1/2}).

Some sets will become problematic: reduce their Δ_S on the fly. Not many sets become problematic, and the entropy penalty is low.

[Figure: danger thresholds at 20 n^{1/2}, 30 n^{1/2}, 35 n^{1/2}, … (Danger 1, Danger 2, Danger 3, …).]

Page 69: Discrepancy and SDPs

Remarks

Construct coloring over time by solving sequence of SDPs (guided by existence results)

Works quite generally

Can be derandomized [Bansal-Spencer] (use the entropy method itself for derandomizing, plus the usual techniques). E.g., deterministic six standard deviations can be viewed as a way to derandomize something stronger than Chernoff bounds.

Page 70: Discrepancy and SDPs

Lovett Meka’12

Our algorithm still uses the partial coloring method; it did not give a new proof of Spencer's result.

Is there a purely constructive proof ?

Lovett-Meka'12: Yes. Gaussian random walks + linear algebra.

Page 71: Discrepancy and SDPs

The new algorithm

Maintain a set V of constraints. Initially V = ∅.

V will grow over time and contains:
i) x_i = −1 or +1 for some variables i;
ii) for some sets j, the set's discrepancy constraint has become tight.

Algorithm: Do a random Gaussian step in the subspace orthogonal to the constraints in V. Let b_1, …, b_k be an orthonormal basis for this subspace; take the step Σ_l g_l b_l with the g_l iid N(0,1).
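A sketch of one step of this walk (my own rendering of the idea on this slide, using scipy's null_space): gather the constraints currently in V and take a Gaussian step inside the orthogonal subspace.

```python
import numpy as np
from scipy.linalg import null_space

def lovett_meka_step(A, x, deltas, step=1e-2, eps=1e-6):
    """One update x -> x + step * g, where g is a Gaussian drawn inside the subspace
    orthogonal to (i) coordinates with |x_i| ~ 1 and (ii) rows j of A whose
    discrepancy constraint |<A_j, x>| <= deltas[j] has become tight."""
    n = len(x)
    rows = []
    for i in np.flatnonzero(np.abs(x) >= 1 - eps):            # fixed coordinates
        e = np.zeros(n); e[i] = 1.0
        rows.append(e)
    for j in np.flatnonzero(np.abs(A @ x) >= deltas - eps):    # tight set constraints
        rows.append(A[j].astype(float))
    basis = null_space(np.array(rows)) if rows else np.eye(n)
    g = basis @ np.random.standard_normal(basis.shape[1])      # isotropic in the allowed subspace
    return np.clip(x + step * g, -1.0, 1.0)
```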

Page 72: Discrepancy and SDPs

Analysis

g is a random Gaussian vector in the allowed subspace.

Evolution of the energy ‖x_t‖². Key point: the energy grows in expectation at each step.

So the energy keeps rising as long as few constraints are tight. Disc(set i): a random walk with step size controlled by that set's constraint.

Page 73: Discrepancy and SDPs

Matousek Lower Bound

We saw

Thm (Lovasz-Spencer-Vesztergombi'86): herdisc(A) ≥ ½ · detlb(A), where detlb(A) = max_k max_{k×k submatrices B} |det(B)|^{1/k}.

Conjecture: herdisc ≤ O(1) · detlb. Remark: for TU matrices, herdisc(A) = 1 and detlb = 1.

Page 74: Discrepancy and SDPs

Detlb

Hoffman: a counterexample, with detlb(A) ≤ 2. Palvolgyi'11: …

Matousek’11: herdisc(A) O(log n ) detlb.

Other implications:

Page 75: Discrepancy and SDPs

Proof Sketch

Thm (LSV'86): herdisc(A) ≥ ½ · detlb(A). [Exercise]

Consider the polytope P = {x : ‖Ax‖_∞ ≤ 1}.

Let t be the smallest number such that, for every point x, the copy of tP centered at x contains some integer point.

Claim: t ≤ herdisc(A) (by the rounding property).

Same as: if you place copies of tP at all integer points, they cover R^n. Now connect the volume of tP to the determinant.

Page 76: Discrepancy and SDPs

Hoffman’s example

T: a k-ary tree of depth k (about k^k nodes).

S: the sets of all edges out of a single node. S': all leaf-to-root paths. Both set systems are TU.

Herdisc(S ∪ S') = k.

Claim: detlb(S ∪ S') = O(1). Pf: Laplace expansion of the determinant.

Page 77: Discrepancy and SDPs

Matousek’s resultThm: Herdisc(A) = O(log n ) detlb(A)

Define vecdisc(A) = the minimum λ such that the vector relaxation
|Σ_{i∈S_j} v_i|² ≤ λ², |v_i|² = 1
is feasible.

Define hervecdisc(A) analogously. The SDP algorithm implies …

Will show:

Page 78: Discrepancy and SDPs

Recap: LP Duality

Min

There is a solution of value 7.

How do you convince someone that nothing is better?

Min cᵀx subject to Ax ≥ b. There exists a dual witness (a nonnegative combination of the constraints) certifying that every feasible solution has value ≥ LP OPT.

Page 79: Discrepancy and SDPs

SDP Duality

If …, there exist weights w_j ≥ 0 and values z_i such that … for all ….

Why is this a witness?


Page 80: Discrepancy and SDPs

Proof Sketch

As …, there exists a subset of variables U such that … for all ….

Can show this submatrix has large eigenvalues and hence a large sub-determinant.

Page 81: Discrepancy and SDPs

Conclusions

Various basic questions remain open in discrepancy.

Algorithmic questions: Conjecture (Matousek'11): disc(A) = O(hervecdisc(A)) (this would imply the tight bound herdisc(A) = O(log m) · detlb(A)).

Obtain the Banaszczyk bound (O(t^{1/2} log^{1/2} n)) constructively for Beck-Fiala.

Various other non-constructive methods: counting, topological, fixed points, … What can be made constructive is not so well understood.