Static and dynamic evaluation of data dependence analysis techniques

Paul M. Petersen, David A. Padua

Research output: Contribution to journal › Review article

Abstract

Data dependence analysis techniques are the main component of today's strategies for automatic detection of parallelism. Parallelism detection strategies are being incorporated in commercial compilers with increasing frequency because of the widespread use of processors capable of exploiting instruction-level parallelism and the growing importance of multiprocessors. An assessment of the accuracy of data dependence tests is therefore of great importance for compiler writers and researchers. The tests evaluated in this study include the generalized greatest common divisor test, three variants of Banerjee's test, and the Omega test. Their effectiveness was measured with respect to the Perfect Benchmarks and the linear algebra libraries EISPACK and LAPACK. Two methods were applied: one using only compile-time information for the analysis, and a second using information gathered during program execution. The results indicate that Banerjee's test is, for all practical purposes, as accurate as the more complex Omega test in detecting parallelism. However, the Omega test is quite effective in proving the existence of dependences, in contrast with Banerjee's test, which can only disprove, or break, dependences. The Omega test's capability of proving dependences could have a significant impact on several compiler algorithms not considered in this study.
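To illustrate the kind of test being evaluated, below is a minimal sketch of the classic single-subscript GCD dependence test (a simpler relative of the generalized GCD test studied in the paper; the function name and interface are illustrative, not from the paper). Given a write to `A[a*i + c1]` and a read of `A[b*j + c2]`, a dependence requires an integer solution to `a*i - b*j = c2 - c1`, and such a solution exists iff `gcd(a, b)` divides the constant term:

```python
from math import gcd

def gcd_test(a: int, b: int, c: int) -> bool:
    """Return True if a dependence MAY exist, i.e. the linear
    Diophantine equation a*i - b*j = c has an integer solution.
    By the GCD test, a solution exists iff gcd(a, b) divides c."""
    g = gcd(a, b)
    # Degenerate case: both coefficients zero, so the equation
    # reduces to 0 = c and holds only when c is zero.
    if g == 0:
        return c == 0
    return c % g == 0

# A[2*i] written vs A[2*j + 1] read: 2*i - 2*j = 1, and gcd(2, 2) = 2
# does not divide 1, so the test disproves the dependence.
print(gcd_test(2, 2, 1))   # False: iterations may run in parallel
# A[2*i] vs A[4*j + 2]: 2*i - 4*j = 2, and gcd(2, 4) = 2 divides 2,
# so a dependence may exist (the test cannot disprove it).
print(gcd_test(2, 4, 2))   # True
```

Note the asymmetry the abstract highlights: like Banerjee's test, this test can only disprove a dependence; a `True` result does not prove one exists, whereas the Omega test can decide the question exactly.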

Original language: English (US)
Pages (from-to): 1121-1132
Number of pages: 12
Journal: IEEE Transactions on Parallel and Distributed Systems
Volume: 7
Issue number: 11
DOI: 10.1109/71.544354
State: Published - Dec 1 1996


Keywords

  • Automatic parallelization
  • Compiler optimizations
  • Dependence analysis
  • Evaluation of compiler techniques
  • Parallelism detection

ASJC Scopus subject areas

  • Signal Processing
  • Hardware and Architecture
  • Computational Theory and Mathematics

Cite this

Static and dynamic evaluation of data dependence analysis techniques. / Petersen, Paul M.; Padua, David A.

In: IEEE Transactions on Parallel and Distributed Systems, Vol. 7, No. 11, 01.12.1996, p. 1121-1132.

Research output: Contribution to journal › Review article

@article{5a43dc4b0ee14cc582fd720211e165b7,
title = "Static and dynamic evaluation of data dependence analysis techniques",
abstract = "Data dependence analysis techniques are the main component of today's strategies for automatic detection of parallelism. Parallelism detection strategies are being incorporated in commercial compilers with increasing frequency because of the widespread use of processors capable of exploiting instruction-level parallelism and the growing importance of multiprocessors. An assessment of the accuracy of data dependence tests is therefore of great importance for compiler writers and researchers. The tests evaluated in this study include the generalized greatest common divisor test, three variants of Banerjee's test, and the Omega test. Their effectiveness was measured with respect to the Perfect Benchmarks and the linear algebra libraries, EISPACK and LAPACK. Two methods were applied, one using only compile-time information for the analysis, and the second using information gathered during program execution. The results indicate that Banerjee's test is for all practical purposes as accurate as the more complex Omega test in detecting parallelism. However, the Omega test is quite effective in proving the existence of dependences, in contrast with Banerjee's test, which can only disprove, or break dependences. The capability of the Omega test of proving dependences could have a significant impact on several compiler algorithms not considered in this study.",
keywords = "Automatic parallelization, Compiler optimizations, Dependence analysis, Evaluation of compiler techniques, Parallelism detection",
author = "Petersen, {Paul M.} and Padua, {David A.}",
year = "1996",
month = "12",
day = "1",
doi = "10.1109/71.544354",
language = "English (US)",
volume = "7",
pages = "1121--1132",
journal = "IEEE Transactions on Parallel and Distributed Systems",
issn = "1045-9219",
publisher = "IEEE Computer Society",
number = "11",
}

TY - JOUR

T1 - Static and dynamic evaluation of data dependence analysis techniques

AU - Petersen, Paul M.

AU - Padua, David A.

PY - 1996/12/1

Y1 - 1996/12/1

N2 - Data dependence analysis techniques are the main component of today's strategies for automatic detection of parallelism. Parallelism detection strategies are being incorporated in commercial compilers with increasing frequency because of the widespread use of processors capable of exploiting instruction-level parallelism and the growing importance of multiprocessors. An assessment of the accuracy of data dependence tests is therefore of great importance for compiler writers and researchers. The tests evaluated in this study include the generalized greatest common divisor test, three variants of Banerjee's test, and the Omega test. Their effectiveness was measured with respect to the Perfect Benchmarks and the linear algebra libraries, EISPACK and LAPACK. Two methods were applied, one using only compile-time information for the analysis, and the second using information gathered during program execution. The results indicate that Banerjee's test is for all practical purposes as accurate as the more complex Omega test in detecting parallelism. However, the Omega test is quite effective in proving the existence of dependences, in contrast with Banerjee's test, which can only disprove, or break dependences. The capability of the Omega test of proving dependences could have a significant impact on several compiler algorithms not considered in this study.

AB - Data dependence analysis techniques are the main component of today's strategies for automatic detection of parallelism. Parallelism detection strategies are being incorporated in commercial compilers with increasing frequency because of the widespread use of processors capable of exploiting instruction-level parallelism and the growing importance of multiprocessors. An assessment of the accuracy of data dependence tests is therefore of great importance for compiler writers and researchers. The tests evaluated in this study include the generalized greatest common divisor test, three variants of Banerjee's test, and the Omega test. Their effectiveness was measured with respect to the Perfect Benchmarks and the linear algebra libraries, EISPACK and LAPACK. Two methods were applied, one using only compile-time information for the analysis, and the second using information gathered during program execution. The results indicate that Banerjee's test is for all practical purposes as accurate as the more complex Omega test in detecting parallelism. However, the Omega test is quite effective in proving the existence of dependences, in contrast with Banerjee's test, which can only disprove, or break dependences. The capability of the Omega test of proving dependences could have a significant impact on several compiler algorithms not considered in this study.

KW - Automatic parallelization

KW - Compiler optimizations

KW - Dependence analysis

KW - Evaluation of compiler techniques

KW - Parallelism detection

UR - http://www.scopus.com/inward/record.url?scp=0030286827&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0030286827&partnerID=8YFLogxK

U2 - 10.1109/71.544354

DO - 10.1109/71.544354

M3 - Review article

AN - SCOPUS:0030286827

VL - 7

SP - 1121

EP - 1132

JO - IEEE Transactions on Parallel and Distributed Systems

JF - IEEE Transactions on Parallel and Distributed Systems

SN - 1045-9219

IS - 11

ER -