
  • IMPROVING COMPILER OPTIMIZATIONS USING MACHINE LEARNING

    by

    Sameer Kulkarni

    A dissertation submitted to the Faculty of the University of Delaware in partial fulfillment of the requirements for the degree of Doctor of Philosophy in Computer and Information Sciences

    Summer 2014

    © 2014 Sameer Kulkarni All Rights Reserved

  • All rights reserved

    INFORMATION TO ALL USERS: The quality of this reproduction is dependent upon the quality of the copy submitted. In the unlikely event that the author did not send a complete manuscript and there are missing pages, these will be noted. Also, if material had to be removed, a note will indicate the deletion.

    Microform Edition © ProQuest LLC. All rights reserved. This work is protected against unauthorized copying under Title 17, United States Code.

    ProQuest LLC, 789 East Eisenhower Parkway, P.O. Box 1346, Ann Arbor, MI 48106-1346

    UMI Number: 3642324. Published by ProQuest LLC (2014). Copyright in the Dissertation held by the Author.

  • IMPROVING COMPILER OPTIMIZATIONS USING MACHINE LEARNING

    by

    Sameer Kulkarni

    Approved: Errol L. Lloyd, Ph.D. Chair of the Department of Computer and Information Sciences

    Approved: Babatunde A. Ogunnaike, Ph.D. Interim Dean of the College of Engineering

    Approved: James G. Richards, Ph.D. Vice Provost for Graduate and Professional Education

  • I certify that I have read this dissertation and that in my opinion it meets the academic and professional standard required by the University as a dissertation for the degree of Doctor of Philosophy.

    Signed: John Cavazos, Ph.D. Professor in charge of dissertation

    I certify that I have read this dissertation and that in my opinion it meets the academic and professional standard required by the University as a dissertation for the degree of Doctor of Philosophy.

    Signed: James Clause, Ph.D. Member of dissertation committee

    I certify that I have read this dissertation and that in my opinion it meets the academic and professional standard required by the University as a dissertation for the degree of Doctor of Philosophy.

    Signed: Xiaoming Li, Ph.D. Member of dissertation committee

    I certify that I have read this dissertation and that in my opinion it meets the academic and professional standard required by the University as a dissertation for the degree of Doctor of Philosophy.

    Signed: Chengmo Yang, Ph.D. Member of dissertation committee

  • I certify that I have read this dissertation and that in my opinion it meets the academic and professional standard required by the University as a dissertation for the degree of Doctor of Philosophy.

    Signed: Mario Wolczko, Ph.D. Member of dissertation committee

  • ACKNOWLEDGEMENTS

    The work done during my years as a graduate student has been inspired and enabled by many people. I am grateful for their support and encouragement, and I hope to emulate their support and kindness at every opportunity.

    First, I would like to thank my advisor, John Cavazos, whose support and confidence were absolutely instrumental, and evident at every step of this dissertation. His patience and kindness gave me the confidence to continue during my most troubling times, in research as well as in my personal life.

    I want to thank Dr. Mario Wolczko for his input and encouragement during my summers at Oracle, and for the opportunities that helped me during this research; and Dr. Christian Wimmer and Douglas Simon for their insights and for the freedom they gave me to work on some exciting projects. I am grateful for the opportunity offered to me by Elenita Silverstein at JPMC, which helped me immensely in writing the final chapter of this thesis.

    I would also like to thank all the present and past members of the Cavazos Lab for their help, their friendship, and, above all, the sense of belonging. I would also like to thank my friends from before and during grad school, who helped me proofread and correct many typos and errors. I will show my gratitude with copious amounts of C2H5OH when we meet.

    Finally, and most importantly, I would like to thank my family: my mother Urmila Kulkarni, my father Col. S. A. Kulkarni, my sister Anagha, and my wife Rasika, for their support and for molding me into the person I am today. I would not be here had it not been for your encouragement and support.


  • Dedicated to:

    My late mother,

    wish you were with us today...


  • TABLE OF CONTENTS

    LIST OF TABLES . . . ix
    LIST OF FIGURES . . . xi
    ABSTRACT . . . xiv

    Chapter

    1 INTRODUCTION . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1

    1.1 Motivation . . . 2
    1.2 Compiler Tuning . . . 7
    1.3 Optimization Ordering . . . 9
    1.4 Optimization Tuning . . . 11
    1.5 Structure of the dissertation . . . 12

    2 BACKGROUND AND RELATED WORK . . . . . . . . . . . . . . 14

    2.1 Auto-tuning . . . 15
    2.2 Machine Learning Applied to Compilation . . . 16
    2.3 Phase Ordering . . . 18
    2.4 Method Inlining . . . 20
    2.5 Machine Learning . . . 22
    2.6 Markov Property . . . 25
    2.7 Overview of Training and Deployment . . . 26
    2.8 Neuro-Evolution Overview . . . 28
    2.9 Decision Tree . . . 32
    2.10 Fitness Function . . . 35
    2.11 Genetic Algorithms using ECJ . . . 36

    3 OPTIMIZATION ORDERING . . . . . . . . . . . . . . . . . . . . . . 38

    3.1 Phase-Ordering with Genetic Algorithms . . . 39
    3.2 Issues with Current State-of-the-Art . . . 41
    3.3 Proposed Solution . . . 42


  • 3.4 Feature Extraction for Phase Ordering . . . 44
    3.5 Experimental Setup . . . 45
    3.6 Optimization Levels . . . 47
    3.7 Results . . . 52
    3.8 Discussion . . . 58

    4 OPTIMIZATION TUNING . . . . . . . . . . . . . . . . . . . . . . . . 69

    4.1 Introduction to Method Inlining . . . 70
    4.2 Importance of Method Inlining . . . 72
    4.3 Present Inlining Methodology . . . 73
    4.4 Areas with Potential for Improvement . . . 74
    4.5 Other Proposed Solutions . . . 79
    4.6 Search Space of Method Inlining Settings . . . 80
    4.7 Approach . . . 81
    4.8 Experimental Setup . . . 86
    4.9 Benchmarks . . . 87
    4.10 Results . . . 89

    5 OPTIMIZATION SELECTION . . . . . . . . . . . . . . . . . . . . . 104

    5.1 Introduction to Optimization Selection . . . 105
    5.2 Optimization Levels . . . 107
    5.3 Optimization Flag Filtering . . . 108
    5.4 Benchmark Selection . . . 115
    5.5 Training . . . 116
    5.6 Dynamic Instruction Counts vs. Execution Time . . . 119
    5.7 Experimental Setup and Terminology . . . 123
    5.8 Results . . . 126

    6 CONCLUSION . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 134

    BIBLIOGRAPHY . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 137


  • LIST OF TABLES

    1.1 Table illustrating the enormity of the phase-ordering search space . . . 5

    3.1 Source features collected during Phase ordering . . . . . . . . . . . 46

    3.2 Optimizations (and abbreviations) used in present phase ordering experiments. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48

    3.3 Average training time by GA for each benchmark individually. . . . 51

    3.4 Time taken in days to train the training set, to provide the results in Figure 3.7 . . . . . . . . . . . . . . . . . . . . . . .
