TRANSCRIPT
Fast algorithms based on asymptotic expansions of special functions
Alex Townsend
MIT
Cornell University, 23rd January 2015, CAM Colloquium
Introduction
Two trends in the computing era
Alex Townsend @MIT

Tabulations → Software: logarithms, Gauss quadrature, special functions.
Classical analysis ≈ Numerical analysis: fast multipole method, butterfly schemes, iterative algorithms, finite element method.
1/22
Introduction
A selection of my research
Alex Townsend @MIT

Algebraic geometry: solving $p(x, y) = q(x, y) = 0$. [Nakatsukasa, Noferini, & T., 2015]

Approx. theory:
$$f(x, y) \approx \sum_{j=1}^{K} c_j(y)\, r_j(x)$$
[Battles & Trefethen, 2004], [Trefethen & T., 2014]
2/22
Introduction
Many special functions are trigonometric-like
Alex Townsend @MIT

Trigonometric functions: cos(ωx), sin(ωx)
Chebyshev polynomials: T_n(x)
Legendre polynomials: P_n(x)
Bessel functions: J_ν(z)
Airy functions: Ai(x)

Also, Jacobi polynomials, Hermite polynomials, parabolic cylinder functions, etc.
3/22
Introduction
Asymptotic expansions of special functions
Alex Townsend @MIT

Legendre polynomials [Thomas Stieltjes]:
$$P_n(\cos\theta) \sim C_n \sum_{m=0}^{M-1} h_{m,n}\, \frac{\cos\!\left((m+n+\tfrac{1}{2})\theta - (m+\tfrac{1}{2})\tfrac{\pi}{2}\right)}{(2\sin\theta)^{m+1/2}}, \qquad n \to \infty$$

Bessel functions [Hermann Hankel]:
$$J_0(z) \sim \sqrt{\frac{2}{\pi z}}\left(\cos\!\left(z-\tfrac{\pi}{4}\right)\sum_{m=0}^{M-1} \frac{(-1)^m a_{2m}}{z^{2m}} - \sin\!\left(z-\tfrac{\pi}{4}\right)\sum_{m=0}^{M-1} \frac{(-1)^m a_{2m+1}}{z^{2m+1}}\right), \qquad z \to \infty$$
4/22
Introduction
Numerical pitfalls of an asymptotic expansionist
Alex Townsend @MIT

Fix M. Where is the asymptotic expansion accurate?
$$J_0(z) = \sqrt{\frac{2}{\pi z}}\left(\cos\!\left(z-\tfrac{\pi}{4}\right)\sum_{m=0}^{M-1} \frac{(-1)^m a_{2m}}{z^{2m}} - \sin\!\left(z-\tfrac{\pi}{4}\right)\sum_{m=0}^{M-1} \frac{(-1)^m a_{2m+1}}{z^{2m+1}}\right) + R_M(z)$$
5/22
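To make this concrete, here is a minimal Python sketch (not from the talk) that evaluates Hankel's truncated expansion with M terms, using the standard coefficients $a_k(0) = (-1)^k\,1^2 3^2 \cdots (2k-1)^2 / (k!\,8^k)$; for moderate z, a few terms already give many digits of $J_0(z)$.

```python
import math

def a(k):
    # Hankel coefficients a_k(0) = prod_{j=1}^{k} (-(2j-1)^2) / (k! 8^k)
    num = 1.0
    for j in range(1, k + 1):
        num *= -(2 * j - 1) ** 2
    return num / (math.factorial(k) * 8.0 ** k)

def j0_asym(z, M):
    """Truncated Hankel asymptotic expansion of J_0(z) with M terms per sum."""
    w = z - math.pi / 4
    P = sum((-1) ** m * a(2 * m) / z ** (2 * m) for m in range(M))
    Q = sum((-1) ** m * a(2 * m + 1) / z ** (2 * m + 1) for m in range(M))
    return math.sqrt(2 / (math.pi * z)) * (math.cos(w) * P - math.sin(w) * Q)
```

With M = 3 and z = 20, the result agrees with the known value J_0(20) ≈ 0.16702466 to well beyond single precision, illustrating why a fixed M suffices once z is large enough.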
Introduction
Talk outline
Alex Townsend @MIT

Using asymptotic expansions of special functions:
I. Computing Gauss quadrature rules. [Hale & T., 2013], [T., Trogdon, & Olver, 2015]
II. Fast transforms based on asymptotic expansions. [Hale & T., 2014], [T., 2015]
6/22
Part I: Computing Gauss quadrature rules
Alex Townsend @MIT

P_n(x) = Legendre polynomial of degree n
7/22
Part I: Computing Gauss quadrature rules
Definition
Alex Townsend @MIT

n-point quadrature rule:
$$\int_{-1}^{1} f(x)\,dx \approx \sum_{k=1}^{n} w_k f(x_k)$$

Gauss quadrature rule:
$$x_k = k\text{th zero of } P_n, \qquad w_k = \frac{2}{(1-x_k^2)\,[P_n'(x_k)]^2}$$

1. Exactly integrates polynomials of degree ≤ 2n − 1. Best possible.
2. Computed by the Golub–Welsch algorithm. [Golub & Welsch, 1969]
3. In 2004: no scheme for computing Gauss rules in O(n). [Bailey, Jeyabalan, & Li, 2004]
8/22
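As a quick illustration of the exactness property (not from the talk), the classical 3-point Gauss–Legendre rule, with its well-known closed-form nodes ±√(3/5), 0 and weights 5/9, 8/9, 5/9, integrates every polynomial of degree ≤ 2·3 − 1 = 5 exactly, but not degree 6:

```python
import math

# Classical 3-point Gauss-Legendre rule: closed-form nodes and weights
nodes = [-math.sqrt(3.0 / 5.0), 0.0, math.sqrt(3.0 / 5.0)]
weights = [5.0 / 9.0, 8.0 / 9.0, 5.0 / 9.0]

def gauss3(f):
    """Approximate the integral of f over [-1, 1] with the 3-point rule."""
    return sum(w * f(x) for w, x in zip(weights, nodes))

# Degree 4 (<= 5): exact, since the true integral of x^4 over [-1, 1] is 2/5.
# Degree 6 (> 5): the rule gives 0.24, while the true value is 2/7 = 0.2857...
```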
Part I: Computing Gauss quadrature rules
Gauss quadrature experiments
Alex Townsend @MIT

[Plots: quadrature error vs n; execution time vs n]
The Golub–Welsch algorithm is beautiful, but not the state-of-the-art.
There is another way...
9/22
Part I: Computing Gauss quadrature rules
Newton's method... with asymptotics and a change of variables
Alex Townsend @MIT

Solve $P_n(\cos\theta)(\sin\theta)^{1/2} = 0$, $\theta \in (0, \pi)$.

1. Evaluation scheme for $P_n(\cos\theta)(\sin\theta)^{1/2}$
2. Evaluation scheme for $\frac{d}{d\theta}\left[P_n(\cos\theta)(\sin\theta)^{1/2}\right]$
3. Initial guesses for $\cos^{-1}(x_k)$
10/22
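The recipe above can be sketched in a few lines of Python (a simplified sketch, not the talk's algorithm: it runs Newton on P_n(x) = 0 directly via the three-term recurrence instead of in the θ variable, and uses the standard leading-order initial guesses x_k ≈ cos(π(4k − 1)/(4n + 2)) rather than those on the slide):

```python
import math

def legendre(n, x):
    """Evaluate P_n(x) and P_n'(x) via the three-term recurrence, |x| < 1."""
    p0, p1 = 1.0, x
    for k in range(1, n):
        # (k+1) P_{k+1} = (2k+1) x P_k - k P_{k-1}
        p0, p1 = p1, ((2 * k + 1) * x * p1 - k * p0) / (k + 1)
    dp = n * (x * p1 - p0) / (x * x - 1.0)   # (x^2-1) P_n' = n (x P_n - P_{n-1})
    return p1, dp

def gauss_legendre(n, tol=1e-14):
    """Gauss-Legendre nodes and weights by Newton's method (O(n^2) sketch)."""
    nodes, weights = [], []
    for k in range(1, n + 1):
        # Leading-order approximation to the k-th node (assumed initial guess)
        x = math.cos(math.pi * (4 * k - 1) / (4 * n + 2))
        for _ in range(10):
            p, dp = legendre(n, x)
            dx = p / dp
            x -= dx
            if abs(dx) < tol:
                break
        _, dp = legendre(n, x)
        nodes.append(x)
        weights.append(2.0 / ((1.0 - x * x) * dp * dp))
    return nodes, weights
```

Note this sketch costs O(n) per node, hence O(n²) overall; the talk's point is that replacing the recurrence with asymptotic evaluation brings the total down to O(n).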
Part I: Computing Gauss quadrature rules
Evaluation scheme for Pn
Alex Townsend @MIT

Interior (in grey region):
$$P_n(\cos\theta)(\sin\theta)^{1/2} \approx \frac{C_n}{\sqrt{2}} \sum_{m=0}^{M-1} h_{m,n}\, \frac{\cos\!\left((m+n+\tfrac{1}{2})\theta - (m+\tfrac{1}{2})\tfrac{\pi}{2}\right)}{(2\sin\theta)^{m}}$$

Boundary [Baratella & Gatteschi, 1988]:
$$P_n(\cos\theta)(\sin\theta)^{1/2} \approx \theta^{1/2}\left(J_0(\rho\theta)\sum_{m=0}^{M-1} \frac{A_m(\theta)}{\rho^{2m}} + \theta J_1(\rho\theta)\sum_{m=0}^{M-1} \frac{B_m(\theta)}{\rho^{2m+1}}\right), \qquad \rho = n + \tfrac{1}{2}$$

Theorem
For $\epsilon > 0$, $n \ge 200$, and $\theta_k^{\mathrm{init}} = (k - \tfrac{1}{2})\pi/n$, Newton converges and the nodes are computed to an accuracy of $\mathcal{O}(\epsilon)$ in $\mathcal{O}(n\log(1/\epsilon))$ operations.
11/22
Part I: Computing Gauss quadrature rules
Initial guesses for the nodes
Alex Townsend @MIT

Interior: Use Tricomi's initial guesses. [Tricomi, 1950]
Boundary: Use Olver's initial guesses. [Olver, 1974]

[Plot: max error in the initial guesses]
There are now better initial guesses. [Bogaert, 2014]
12/22
Part I: Computing Gauss quadrature rules
The race to compute high-order Gauss quadrature
Alex Townsend @MIT

"The race to compute high-order Gauss quadrature" [T., 2015]
13/22
Part I: Computing Gauss quadrature rules
Applications
Alex Townsend @MIT

1. Psychological: Gauss quadrature rules are cheap.
2. Experimental mathematics: High-precision quadratures. [Bailey, Jeyabalan, & Li, 2004]
3. Cosmic microwave background: Noise removal by L2-projection.

Generically, composite rules converge algebraically.
[Plot: quadrature error for $\int_{-1}^{1} \frac{1}{1 + 100^2 x^2}\,dx$]
14/22
Part II: Fast transforms based on asymptotic expansions
Alex Townsend @MIT

$$\mathbf{P_N} c = \begin{bmatrix} P_0(\cos\theta_0) & \cdots & P_{N-1}(\cos\theta_0) \\ \vdots & \ddots & \vdots \\ P_0(\cos\theta_{N-1}) & \cdots & P_{N-1}(\cos\theta_{N-1}) \end{bmatrix} \begin{bmatrix} c_0 \\ \vdots \\ c_{N-1} \end{bmatrix}, \qquad \theta_k = \frac{k\pi}{N-1}$$
15/22
Part II: Fast transforms based on asymptotic expansions
Fast transforms as evaluation schemes
Alex Townsend @MIT

DFT: $f_k = \sum_{n=0}^{N-1} c_n z_k^n$, $\quad z_k = e^{-2\pi i k/N}$
DCT: $f_k = \sum_{n=0}^{N-1} c_n \cos(n\theta_k)$, $\quad \theta_k = \frac{k\pi}{N-1}$
DCT: $f_k = \sum_{n=0}^{N-1} c_n T_n(\cos\theta_k)$
DLT: $f_k = \sum_{n=0}^{N-1} c_n P_n(\cos\theta_k)$
DHT: $f_k = \sum_{n=0}^{N-1} c_n J_\nu(j_{\nu,n} r_k)$, $\quad r_k = \frac{j_{\nu,k}}{j_{\nu,N}}$, $\quad j_{\nu,n} = n$th root of $J_\nu(z)$
16/22
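For concreteness, a direct O(N²) evaluation of the DLT above (a naive reference implementation, not the fast algorithm) takes one recurrence sweep per evaluation point:

```python
import math

def dlt_direct(c):
    """Direct discrete Legendre transform: f_k = sum_n c_n P_n(cos(theta_k)),
    theta_k = k*pi/(N-1).  O(N^2) via the three-term recurrence; needs N >= 2."""
    N = len(c)
    f = []
    for k in range(N):
        x = math.cos(k * math.pi / (N - 1))
        p0, p1 = 1.0, x                   # P_0(x), P_1(x)
        s = c[0] + c[1] * x
        for n in range(1, N - 1):
            # (n+1) P_{n+1} = (2n+1) x P_n - n P_{n-1}
            p0, p1 = p1, ((2 * n + 1) * x * p1 - n * p0) / (n + 1)
            s += c[n + 1] * p1
        f.append(s)
    return f
```

For example, the coefficient vector c = (0, 0, 1, 0) picks out $P_2(\cos\theta_k) = (3\cos^2\theta_k - 1)/2$, which is an easy consistency check. The fast algorithms of Part II aim to beat this quadratic cost.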
Part II: Fast transforms based on asymptotic expansions
Related work
Alex Townsend @MIT

Fast transforms are very important.
James Cooley: Ahmed, Tukey, Gauss, Gentleman, Natarajan, Rao
Leslie Greengard: Alpert, Barnes, Hut, Martinsson, Rokhlin, Mohlenkamp, Tygert
Emmanuel Candes: Boag, Demanet, Michielssen, O'Neil, Ying
Daniel Potts: Driscoll, Healy, Steidl, Tasche
Many others: Candel, Cree, Johnson, Mori, Orszag, Suda, Sugihara, Takami, etc.
17/22
Part II: Fast transforms based on asymptotic expansions
Asymptotic expansions as a matrix decomposition
Alex Townsend @MIT

The asymptotic expansion
$$P_n(\cos\theta_k) = C_n \sum_{m=0}^{M-1} h_{m,n}\, \frac{\cos\!\left((m+n+\tfrac{1}{2})\theta_k - (m+\tfrac{1}{2})\tfrac{\pi}{2}\right)}{(2\sin\theta_k)^{m+1/2}} + R_{M,n}(\theta_k)$$
gives a matrix decomposition (a sum of diagonally scaled DCTs and DSTs):
$$\mathbf{P_N} = \mathbf{P_N^{ASY}} + \mathbf{R_{M,N}}$$
$$\mathbf{P_N} = \sum_{m=0}^{M-1} \left( D_{u_m} \mathbf{C_N} D_{h_m} + D_{v_m} \begin{bmatrix} 0 & 0 & 0 \\ 0 & \mathbf{S_{N-2}} & 0 \\ 0 & 0 & 0 \end{bmatrix} D_{h_m} \right) + \mathbf{R_{M,N}}$$
18/22
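The m = 0 term of this expansion is, up to standard normalization constants (taken here from the classical Laplace formula, not from the slide), $P_n(\cos\theta) \approx \sqrt{2/(\pi n \sin\theta)}\,\cos((n+\tfrac{1}{2})\theta - \tfrac{\pi}{4})$, and a short check against the three-term recurrence shows it is already accurate in the interior of $(0, \pi)$, which is why so few terms M are needed there:

```python
import math

def legendre_pn(n, x):
    """Reference value of P_n(x) via the three-term recurrence."""
    p0, p1 = 1.0, x
    for k in range(1, n):
        p0, p1 = p1, ((2 * k + 1) * x * p1 - k * p0) / (k + 1)
    return p1 if n > 0 else p0

def pn_leading(n, theta):
    """Leading-order (M = 1) cosine approximation, valid away from 0 and pi."""
    return math.sqrt(2.0 / (math.pi * n * math.sin(theta))) * \
        math.cos((n + 0.5) * theta - math.pi / 4)
```

Because each term of the expansion is a diagonally scaled cosine (or sine) of $(n + m + \tfrac{1}{2})\theta_k$, summing it over n for all k simultaneously is exactly a DCT/DST, which is what the decomposition above exploits.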
Part II: Fast transforms based on asymptotic expansions
Be careful and stay safe
Alex Townsend @MIT

Fix M. Where is the asymptotic expansion accurate?
$$\left|R_{M,n}(\theta_k)\right| \le \frac{2 C_n h_{M,n}}{(2\sin\theta_k)^{M+1/2}}$$
Error curve: $R_{M,n}(\theta_k) = \epsilon$
[Illustration: the structure of $\mathbf{R_{M,N}}$]
19/22
Part II: Fast transforms based on asymptotic expansions
Partitioning and balancing competing costs
Alex Townsend @MIT

Partition with a parameter α on (0, 1): if α is too small or too large, one of the two competing costs dominates. Balance them with α = O(1/log N):
$$\mathbf{P_N^{(j)}} c: \ \mathcal{O}\!\left(\frac{N (\log N)^2}{\log\log N}\right), \qquad \mathbf{P_N^{EVAL}} c: \ \mathcal{O}\!\left(\frac{N (\log N)^2}{\log\log N}\right)$$

Theorem
The matrix-vector product $f = \mathbf{P_N} c$ can be computed in $\mathcal{O}(N (\log N)^2 / \log\log N)$ operations.
20/22
Part II: Fast transforms based on asymptotic expansions
Numerical results
Alex Townsend @MIT

No precomputation.
[Plot: execution time vs N, direct approach vs using asymptotics]

Modification of rough signals:
$$\int_{-1}^{1} P_n(x)\, e^{-i\omega x}\,dx = i^n \sqrt{\frac{2\pi}{-\omega}}\, J_{n+1/2}(-\omega)$$
[Plot: a rough signal near x ∈ [0.3, 0.34]]
21/22
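The identity above can be checked numerically (a sanity-check sketch, not the talk's code). Using $J_{n+1/2}(z) = \sqrt{2z/\pi}\, j_n(z)$, the right-hand side equals $2(-i)^n j_n(\omega)$ with $j_n$ the spherical Bessel function, which for n = 2 has the closed form $j_2(\omega) = (3/\omega^3 - 1/\omega)\sin\omega - (3/\omega^2)\cos\omega$; composite Simpson reproduces it:

```python
import cmath
import math

def p2(x):
    return 0.5 * (3 * x * x - 1)          # Legendre P_2

def j2(x):
    # Spherical Bessel function j_2 in closed form
    return (3 / x**3 - 1 / x) * math.sin(x) - 3 * math.cos(x) / x**2

def simpson(f, a, b, n=2000):             # n must be even
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3

w = 5.0
lhs = simpson(lambda x: p2(x) * cmath.exp(-1j * w * x), -1.0, 1.0)
rhs = 2 * (-1j) ** 2 * j2(w)              # 2 (-i)^n j_n(w) with n = 2
```

The imaginary part of the left-hand side vanishes by parity here (P_2 is even, sin(ωx) is odd), so the two sides agree as complex numbers.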
Future work
Fast spherical harmonic transform
Alex Townsend @MIT

Spherical harmonic transform:
$$f(\theta, \phi) = \sum_{l=0}^{N-1} \sum_{m=-l}^{l} \alpha_l^m P_l^m(\cos\theta)\, e^{im\phi}$$
[Mohlenkamp, 1999], [Rokhlin & Tygert, 2006], [Tygert, 2008]

[Plot: execution time vs N for existing approaches -- precomputation, fast transform, direct approach]
A new generation of fast algorithms with no precomputation.
22/22
Alex Townsend @MIT
Thank you
References
Alex Townsend @MIT

• D. H. Bailey, K. Jeyabalan, and X. S. Li, A comparison of three high-precision quadrature schemes, 2005.
• P. Baratella and L. Gatteschi, The bounds for the error term of an asymptotic approximation of Jacobi polynomials, Lecture Notes in Mathematics, 1988.
• Z. Battles and L. N. Trefethen, An extension of MATLAB to continuous functions and operators, SISC, 2004.
• I. Bogaert, Iteration-free computation of Gauss–Legendre quadrature nodes and weights, SISC, 2014.
• G. H. Golub and J. H. Welsch, Calculation of Gauss quadrature rules, Math. Comput., 1969.
• N. Hale and A. Townsend, Fast and accurate computation of Gauss–Legendre and Gauss–Jacobi quadrature nodes and weights, SISC, 2013.
• N. Hale and A. Townsend, A fast, simple, and stable Chebyshev–Legendre transform using an asymptotic formula, SISC, 2014.
• Y. Nakatsukasa, V. Noferini, and A. Townsend, Computing the common zeros of two bivariate functions via Bezout resultants, Numerische Mathematik, 2014.
• F. W. J. Olver, Asymptotics and Special Functions, Academic Press, New York, 1974.
• A. Townsend and L. N. Trefethen, An extension of Chebfun to two dimensions, SISC, 2013.
• A. Townsend, T. Trogdon, and S. Olver, Fast computation of Gauss quadrature nodes and weights on the whole real line, to appear in IMA J. Numer. Anal., 2015.
• A. Townsend, The race to compute high-order Gauss quadrature, SIAM News, 2015.
• A. Townsend, A fast analysis-based discrete Hankel transform using asymptotic expansions, submitted, 2015.
• F. G. Tricomi, Sugli zeri dei polinomi sferici ed ultrasferici, Ann. Mat. Pura Appl., 1950.