A hardware acceleration technique for gradient descent and conjugate gradient

David Kesler, Biplab Deka, Rakesh Kumar

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Application Robustification, a promising approach for reducing processor power, converts applications into numerical optimization problems and solves them using gradient descent and conjugate gradient algorithms [1]. The improvement in robustness, however, comes at the expense of performance when compared to the baseline non-iterative versions of these applications. To mitigate the performance loss from robustification, we present the design of a hardware accelerator and corresponding software support that accelerate gradient descent- and conjugate gradient-based iterative implementations of applications. Unlike traditional accelerators, our design accelerates different types of linear algebra operations found in many algorithms and efficiently handles the sparse matrices that arise in applications such as graph matching. We show that the proposed accelerator can provide significant speedups for iterative versions of several applications and that, for some applications such as least squares, it can substantially improve computation time compared to the baseline non-iterative implementation.
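The kernels such an accelerator must handle are the standard linear algebra operations inside the conjugate gradient loop: sparse matrix-vector products, dot products, and vector updates. As a reference point only (this is a textbook sketch, not the accelerator design described in the paper), a minimal conjugate gradient solver in Python, assuming a symmetric positive-definite system stored in SciPy's CSR sparse format, looks like this:

```python
import numpy as np
from scipy.sparse import csr_matrix

def conjugate_gradient(A, b, tol=1e-8, max_iter=1000):
    """Solve A x = b for a symmetric positive-definite (possibly sparse) A."""
    x = np.zeros_like(b)
    r = b - A @ x            # residual
    p = r.copy()             # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p                       # sparse matrix-vector product (dominant kernel)
        alpha = rs_old / (p @ Ap)        # step length from two dot products
        x += alpha * p                   # vector update (axpy)
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p    # new search direction
        rs_old = rs_new
    return x

# Small illustrative SPD system (hypothetical example data)
A = csr_matrix(np.array([[4.0, 1.0, 0.0],
                         [1.0, 3.0, 0.0],
                         [0.0, 0.0, 2.0]]))
b = np.array([1.0, 2.0, 3.0])
x = conjugate_gradient(A, b)
```

The sparse matrix-vector product `A @ p` dominates each iteration, which is why efficient sparse-matrix support matters for applications such as graph matching, where the system matrices are large and mostly zero.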

Original language: English (US)
Title of host publication: Proceedings of the 2011 IEEE 9th Symposium on Application Specific Processors, SASP 2011
Pages: 94-101
Number of pages: 8
DOIs
State: Published - 2011
Event: 2011 IEEE 9th Symposium on Application Specific Processors, SASP 2011 - San Diego, CA, United States
Duration: Jun 5 2011 - Jun 6 2011

Publication series

Name: Proceedings of the 2011 IEEE 9th Symposium on Application Specific Processors, SASP 2011

Other

Other: 2011 IEEE 9th Symposium on Application Specific Processors, SASP 2011
Country/Territory: United States
City: San Diego, CA
Period: 6/5/11 - 6/6/11

ASJC Scopus subject areas

  • Hardware and Architecture
  • Electrical and Electronic Engineering
