
Generative Adversarial Networks (GANs)
Anirban Mukhopadhyay
TU Darmstadt, Germany

Introduction

Generative vs. Discriminative
Generating "realistic-looking" images – one step closer to understanding the data

GAN Results

Example results (figure credits): CycleGAN (image-to-image translation), SRGAN (super-resolution), Karras 2018 (progressive growing of GANs, face synthesis).

What is in it for me?

Examples: MR-to-CT reconstruction (©Wolterink 2017) and anomaly detection (©Schlegl 2017).
Proxy for training data – when annotation is costly or classes are imbalanced
Similarity metric – the discriminator
Domain shift – adversarial training

Outline – Tidying Up GAN, the Marie Kondo way (Closet, Kitchen, Emotional)

Theory
Key GANs
Medical Applications
Adversarial Learning
Limitations of GAN
Summary

Theory

UNSUPERVISED learning: a noise vector z is fed to the generator G, which outputs an image G(z).
One way to critique G: compute the perplexity of the pdf of the generated distribution.
Idea 1: Sidestep perplexity with deep nets
Idea 2: Gradient feedback from the discriminator
Idea 3: A game of many moves
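As a brief formal aside (my wording, not the slides'): the generator only defines an implicit distribution, which is why likelihood-based scores such as perplexity are hard to evaluate and get sidestepped in Idea 1.

```latex
\[
z \sim p_z = \mathcal{N}(0, I), \qquad x = G(z) \;\sim\; P_{\text{synth}}
\]
```

Here P_synth is the pushforward of p_z through the deep net G; its density has no closed form for a deep G, so it cannot simply be plugged into a perplexity computation.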

Theory – Idea 1: Sidestep perplexity with deep nets

Measure the similarity of P_real and P_synth with a deep net D that maps images to [0, 1]:
E_x[D(x)] is high if x ∈ P_real
E_x[D(x)] is low if x ∈ P_synth
Train D using backpropagation.
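Written out, this is the discriminator objective of Goodfellow 2014 (the log form matching the slide's f(x) = log(x) later on):

```latex
\[
\max_D \;\; \mathbb{E}_{x \sim P_{\text{real}}}\big[\log D(x)\big] \;+\; \mathbb{E}_{z \sim p_z}\big[\log\big(1 - D(G(z))\big)\big]
\]
```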

Theory – Idea 2: Gradient feedback from the discriminator

Goal of the generator G: make E_z[D(G(z))] as high as possible, i.e., fool the discriminator.
Train G by backpropagating through D(G(·)), as sketched below.
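A minimal sketch of how that gradient feedback looks as an alternating update. This is illustrative PyTorch-style code under assumed names (`G`, `D`, `opt_G`, `opt_D`, `real_batch`, `z_dim` are mine; the slides prescribe no framework), using the common non-saturating generator loss:

```python
import torch
import torch.nn.functional as F

# One alternating GAN update. Assumes G: (batch, z_dim) noise -> images and
# D: images -> a logit of shape (batch, 1); opt_G / opt_D are their optimizers.
def gan_step(G, D, opt_G, opt_D, real_batch, z_dim=100):
    batch = real_batch.size(0)

    # Discriminator move: push D(real) towards 1 and D(fake) towards 0.
    fake = G(torch.randn(batch, z_dim)).detach()   # detach: no gradient into G here
    d_loss = (F.binary_cross_entropy_with_logits(D(real_batch), torch.ones(batch, 1))
              + F.binary_cross_entropy_with_logits(D(fake), torch.zeros(batch, 1)))
    opt_D.zero_grad(); d_loss.backward(); opt_D.step()

    # Generator move: gradient feedback flows through D(G(z)) back into G.
    g_loss = F.binary_cross_entropy_with_logits(D(G(torch.randn(batch, z_dim))),
                                                torch.ones(batch, 1))
    opt_G.zero_grad(); g_loss.backward(); opt_G.step()
    return d_loss.item(), g_loss.item()
```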

Theory – Idea 3: A game of many moves

Generator and discriminator take turns, like moves in a game.
For Goodfellow 2014, f(x) = log(x).
Since the derivative of log(x) is 1/x, training is most sensitive to the instances that D finds awful (i.e., samples for which D(G(z)) is close to 0).
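Putting the three ideas together gives the two-player value function of Goodfellow 2014; with f(x) = log(x) as on the slide:

```latex
\[
\min_G \max_D \; V(D, G) \;=\;
\mathbb{E}_{x \sim P_{\text{real}}}\big[\log D(x)\big]
\;+\; \mathbb{E}_{z \sim p_z}\big[\log\big(1 - D(G(z))\big)\big]
\]
```

In the commonly used non-saturating variant the generator instead maximizes E_z[log D(G(z))]; the derivative of log is 1/x, so this gradient is largest where D(G(z)) is close to 0, exactly the instances the discriminator finds awful.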

Understanding Key GANs – Engineering Recipe

Describe each GAN by three ingredients: I/P and O/P, architecture, and loss function.
Key GANs covered next: DC-GAN, C-GAN, Cycle-GAN.

Deep Convolutional GAN (DC-GAN)

Unsupervised representation learning; the learned latent space supports interpolation (see the sketch below).
I/P: z (100-D multivariate Gaussian)
O/P: Image
Architecture: all-convolutional generator and discriminator (see the DC-GAN recipe in the backup slides)
Loss function: same as Goodfellow 2014
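To illustrate the latent-space interpolation mentioned above, a minimal sketch (a trained DC-GAN generator `G` mapping 100-D Gaussian noise to images is assumed; the function and names are illustrative, not from the paper):

```python
import torch

# Linear interpolation between two latent codes of a trained generator G.
# Assumes G maps (batch, z_dim) noise to (batch, C, H, W) images.
@torch.no_grad()
def interpolate(G, steps=8, z_dim=100):
    z0, z1 = torch.randn(1, z_dim), torch.randn(1, z_dim)
    alphas = torch.linspace(0.0, 1.0, steps).view(-1, 1)
    z = (1 - alphas) * z0 + alphas * z1    # (steps, z_dim) points along the line z0 -> z1
    return G(z)                            # (steps, C, H, W): images morph smoothly
```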

Conditional GAN (C-GAN)

How to bring in some supervision?
Example: generate digit images conditioned on the class label (0–9).
I/P: z and a condition c (e.g., 0, 1, 2, …)
O/P: Image
Architecture: the condition c is fed to both the generator and the discriminator
Loss function: conditional GAN loss (see the sketch below)
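A sketch of that conditional objective in the form introduced by Mirza & Osindero (2014), with both networks receiving the condition c:

```latex
\[
\min_G \max_D \;\;
\mathbb{E}_{(x, c) \sim P_{\text{real}}}\big[\log D(x \mid c)\big]
\;+\; \mathbb{E}_{z \sim p_z,\, c}\big[\log\big(1 - D(G(z \mid c) \mid c)\big)\big]
\]
```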

Cycle-GAN

How to incorporate UN-PAIRED images for style/domain transfer?
I/P: Image (domain X)
O/P: Image (domain Y)
Architecture: two generators (X→Y and Y→X) and a discriminator per domain
Loss function: adversarial losses plus a cycle-consistency loss (see the sketch below)
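A sketch of the cycle-consistency term from the Cycle-GAN paper, with generators G: X→Y and F: Y→X; the full objective adds the two adversarial losses for the domain discriminators:

```latex
\[
\mathcal{L}_{\text{cyc}}(G, F) \;=\;
\mathbb{E}_{x \sim P_X}\big[\lVert F(G(x)) - x \rVert_1\big]
\;+\; \mathbb{E}_{y \sim P_Y}\big[\lVert G(F(y)) - y \rVert_1\big]
\]
```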

Engineering Recipe Summary

GANs        I/P      O/P      Architect.   Loss        Note
DC-GAN      z        Img      —            GAN         Unsup.
C-GAN       z, c     Img      Modif. GAN   Cond.       Supervis.
Cycle-GAN   Img (X)  Img (Y)  —            Cycle Loss  Style Transfer

Medical Applications


Medical Applications

Review article (https://arxiv.org/abs/1809.06222): 77 papers reviewed, up to the end of 2018, incl. MICCAI, MIDL, ISBI, TMI, MedIA, etc.
GANs are mostly applied in synthesis and segmentation.
The recurring pattern: modify the architecture, modify the loss – i.e., re-apply the engineering recipe.

Synthesis - Unsupervised

Task: discriminating lung nodules, benign vs. malignant (©Chuquicusma 2018).
Approach: unsupervised synthesis by modifying DC-GAN.
Evaluation: a Visual Turing Test with 2 radiologists.

Synthesis - Unsupervised: recipe

Original DC-GAN: I/P z, O/P image (64x64x3); loss function same as Goodfellow 2014.
Modified for this task: O/P is a lung-nodule image (56x56x1); loss unchanged.
(Result figures of generator G and discriminator D: ©Chuquicusma 2018)

Synthesis - Supervised

Radiotherapy treatment planning uses MR for segmentation of the tumor and organs, and CT for dose planning (©Wolterink 2017).
Goal: MR-only radiotherapy treatment planning – synthesize the CT from the MR.
Approach: re-purpose Cycle-GAN.
I/P: MR image
O/P: CT image
Architecture: Cycle-GAN
Loss function: cycle loss (sum of L1 norms at the MR and CT ends), as sketched below
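A minimal sketch of that cycle loss with two hypothetical generators `G_mr2ct` and `G_ct2mr` (PyTorch-style pseudocode under my own naming, not the authors' implementation):

```python
import torch.nn.functional as F

def cycle_loss(G_mr2ct, G_ct2mr, mr, ct):
    """Sum of L1 errors after a full MR -> CT -> MR and CT -> MR -> CT cycle."""
    mr_rec = G_ct2mr(G_mr2ct(mr))   # MR -> synthetic CT -> reconstructed MR
    ct_rec = G_mr2ct(G_ct2mr(ct))   # CT -> synthetic MR -> reconstructed CT
    return F.l1_loss(mr_rec, mr) + F.l1_loss(ct_rec, ct)
```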

(Result figures: ©Wolterink 2017)

Domain Adaptation - Adversarial Learning

Deep learning segmentation performs well in its training domain but degrades on a new domain (©Kamnitsas 2017).
Task: traumatic brain injury – segment the bleeding.
Idea: learn domain-invariant features via an adversarial auxiliary task.
Here the source and target domains are different MR sequences; a typical deep learning segmenter trained on the source fails on the target.
Architecture: a feature extractor G feeds the segmenter, while a domain discriminator is trained adversarially on G's features (see the sketch below).
(Result figures: ©Kamnitsas 2017)
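A generic sketch of this kind of adversarial feature alignment (not the authors' exact setup; all names are assumptions of mine, and the gradient-reversal layer is just one common way to implement the adversarial auxiliary task):

```python
import torch
import torch.nn.functional as F

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; reverses (and scales) gradients in the backward pass."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_out):
        return -ctx.lam * grad_out, None

# One update: supervised segmentation on the source domain plus domain confusion on features.
# feat_net / segmenter / dom_clf are hypothetical modules; dom_clf maps features to 2 logits;
# opt optimizes the parameters of all three networks.
def domain_adversarial_step(feat_net, segmenter, dom_clf, opt, src_x, src_y, tgt_x, lam=1.0):
    f_src, f_tgt = feat_net(src_x), feat_net(tgt_x)
    seg_loss = F.cross_entropy(segmenter(f_src), src_y)      # labels exist only for the source

    feats = torch.cat([f_src, f_tgt])                         # domain labels: 0 = source, 1 = target
    dom_y = torch.cat([torch.zeros(len(f_src)), torch.ones(len(f_tgt))]).long()
    dom_logits = dom_clf(GradReverse.apply(feats, lam))       # reversed grads push feat_net to confuse dom_clf
    dom_loss = F.cross_entropy(dom_logits, dom_y)

    opt.zero_grad(); (seg_loss + dom_loss).backward(); opt.step()
    return seg_loss.item(), dom_loss.item()
```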

Limitations of GAN


Numerical instability and mode collapse.

Extreme example of the instability: a vector field with constant curl.
Such a field is non-conservative (it is not the gradient of any single loss) and arises naturally in a zero-sum game.
Simultaneous gradient ascent simply follows the arrows and circles, even though the game has an equilibrium at (0, 0).
An initial solution is discussed in "The Numerics of GANs" (see the reading list; illustration © inFERENCe).
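A worked toy version of this picture, using the standard zero-sum game V(x, y) = x·y discussed in "The Numerics of GANs" and the inFERENCe post (the starting point and step size below are arbitrary choices of mine):

```python
# V(x, y) = x * y: player x ascends V, player y descends V (a zero-sum game).
# The simultaneous-update field (y, -x) has constant curl and is non-conservative,
# so the iterates circle and drift away from the equilibrium at (0, 0).
def simulate(steps=200, lr=0.1):
    x, y = 1.0, 1.0
    for _ in range(steps):
        gx, gy = y, -x                       # dV/dx = y ;  -dV/dy = -x
        x, y = x + lr * gx, y + lr * gy      # simultaneous gradient steps
    return x, y

print(simulate())  # spirals outward: the norm is multiplied by (1 + lr**2) ** (steps / 2), about 2.7 here
```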

A further limitation: evaluation metrics – how to judge the quality of the generated samples.

Limitations of GAN - Practical

Counting – medical equivalent: cell images
Perspective – medical equivalent: cross-domain synthesis
Global structure – medical equivalent: reconstruction

Reading List

Engineering: DC-GAN, C-GAN, Cycle-GAN
GAN applications: Living Review (GANs for Medical Image Analysis, https://arxiv.org/abs/1809.06222), Wolterink 2017, Kamnitsas 2017, Chuquicusma 2018
Theory: Numerics of GANs; Are GANs Created Equal?; f-GANs
Blogs: Off the Convex Path; GAN Open Problems
MICCAI 2019 Tutorial – lecturers: A. Mukhopadhyay, J. Wolterink, K. Kamnitsas

Summary

GANs are unsupervised generative models with an adversarial twist.
When done correctly, they produce realistic-looking images of unprecedented quality.
In medical imaging: synthesis as a proxy for training data, and handling domain shift.
Open issues: numerical instability and evaluation metrics.

Review: https://arxiv.org/abs/1809.06222

Thank You!


Backup Slides


DC-GAN Recipe

Replace pooling layers with strided convolutions (discriminator) and fractional-strided convolutions (generator).
Use batchnorm.
Use LeakyReLU in the discriminator.
Use ReLU in the generator for all layers except the output, which uses Tanh.

Synthesis use cases

Unconditional (DC-GAN): data simulation, class imbalance, data augmentation – e.g., prostate lesions, retina patches, skin lesions.
Conditional (C-/Cycle-GAN): CT from MR, PET from CT/MRI, stain normalization.
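A minimal PyTorch-style sketch of a generator following this recipe (fractional-strided convolutions, batchnorm, ReLU, Tanh output) for the 100-D z and 64x64x3 output mentioned earlier; the channel counts are my own illustrative choice:

```python
import torch
import torch.nn as nn

# DC-GAN-style generator per the recipe above: 100-D z -> 64x64x3 image.
class Generator(nn.Module):
    def __init__(self, z_dim=100, ch=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(z_dim, ch * 8, 4, 1, 0, bias=False),   # 1x1  -> 4x4
            nn.BatchNorm2d(ch * 8), nn.ReLU(True),
            nn.ConvTranspose2d(ch * 8, ch * 4, 4, 2, 1, bias=False),  # 4x4  -> 8x8
            nn.BatchNorm2d(ch * 4), nn.ReLU(True),
            nn.ConvTranspose2d(ch * 4, ch * 2, 4, 2, 1, bias=False),  # 8x8  -> 16x16
            nn.BatchNorm2d(ch * 2), nn.ReLU(True),
            nn.ConvTranspose2d(ch * 2, ch, 4, 2, 1, bias=False),      # 16x16 -> 32x32
            nn.BatchNorm2d(ch), nn.ReLU(True),
            nn.ConvTranspose2d(ch, 3, 4, 2, 1, bias=False),           # 32x32 -> 64x64
            nn.Tanh(),                                                # output in [-1, 1]
        )

    def forward(self, z):                     # z: (batch, z_dim)
        return self.net(z.view(z.size(0), z.size(1), 1, 1))

# Usage: Generator()(torch.randn(16, 100)) -> tensor of shape (16, 3, 64, 64)
```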