Optimal broadcast and summation in the LogP model

Richard M. Karp, Abhijit Sahay, Eunice E. Santos, Klaus Erik Schauser

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In many distributed-memory parallel computers the only built-in communication primitive is point-to-point message transmission, and more powerful operations such as broadcast and synchronization must be realized using this primitive. Within the LogP model of parallel computation we present algorithms that yield optimal communication schedules for several broadcast and synchronization operations. Most of our algorithms are absolutely the best possible, in that not even the constant factors can be improved upon. For one particular broadcast problem, called continuous broadcast, the optimality of our algorithm is not yet completely proven, although proofs have been achieved for a certain range of parameters. We also devise an optimal algorithm for summing or, more generally, applying a non-commutative associative binary operator to a set of operands.
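The single-item broadcast schedule described in the abstract can be illustrated with a small greedy simulation under the LogP parameters L (latency), o (per-message overhead), and g (gap between consecutive sends), assuming g ≥ o. The idea, as in the paper, is that every processor holding the datum keeps retransmitting it to uninformed processors as fast as the gap allows; the function below is a minimal sketch (names and structure are our own, not from the paper) that computes the completion time of that schedule.

```python
import heapq

def logp_broadcast_time(P, L, o, g):
    """Completion time of the greedy single-item broadcast in LogP.

    A sketch under the usual assumption g >= o: each informed processor
    begins a new send every g time units, and a message started at time t
    is fully received at t + o + L + o.
    """
    if P <= 1:
        return 0
    free = [0]        # times at which informed processors can next begin a send
    informed = 1      # processor 0 holds the datum at time 0
    finish = 0
    while informed < P:
        t = heapq.heappop(free)       # earliest processor ready to send
        arrive = t + o + L + o        # send overhead + latency + receive overhead
        finish = max(finish, arrive)
        informed += 1
        heapq.heappush(free, t + g)   # sender may send again after the gap
        heapq.heappush(free, arrive)  # the new processor starts retransmitting
    return finish
```

For the parameter set used as a running example in the LogP literature (P = 8, L = 6, o = 2, g = 4), this schedule informs all eight processors by time 24.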

Original language: English (US)
Title of host publication: Proceedings of the 5th Annual ACM Symposium on Parallel Algorithms and Architectures, SPAA 1993
Publisher: Association for Computing Machinery, Inc
Pages: 142-153
Number of pages: 12
ISBN (Electronic): 0897915992, 9780897915991
DOIs
State: Published - Aug 1 1993
Externally published: Yes
Event: 5th Annual ACM Symposium on Parallel Algorithms and Architectures, SPAA 1993 - Velen, Germany
Duration: Jun 30 1993 - Jul 2 1993

Publication series

Name: Proceedings of the 5th Annual ACM Symposium on Parallel Algorithms and Architectures, SPAA 1993

Other

Other: 5th Annual ACM Symposium on Parallel Algorithms and Architectures, SPAA 1993
Country/Territory: Germany
City: Velen
Period: 6/30/93 - 7/2/93

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science
  • Hardware and Architecture
