### Abstract

We consider the modified conjugate gradient (MCG) procedure for solving Ax = b, in which the approximation space is the Krylov space associated with A^{1/p} and b, for an integer p. For the square-root MCG (p = 2) we establish a sharpened bound on the error at each iteration via Chebyshev polynomials in √A. We discuss how an error in √A b committed in the initial stage accumulates rapidly, and derive an error bound that holds even in the presence of such accumulating errors. Although this accumulation may limit the usefulness of the method when √A b is unknown, it can still be applied successfully to a variety of small, "almost-SPD" problems, and can be used to jump-start the conjugate gradient method. Finally, we verify these theoretical results with numerical tests.
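The central idea, projecting onto the Krylov space generated by A^{1/2} and b rather than by A and b, can be illustrated with a small self-contained sketch. This is not the paper's MCG algorithm: it forms √A explicitly by eigendecomposition (feasible only for small dense SPD matrices) and solves by direct Galerkin projection rather than by a short-recurrence iteration; all function names and test sizes below are illustrative choices.

```python
import numpy as np

def sqrtm_spd(A):
    # Principal square root of an SPD matrix via eigendecomposition
    # (illustrative only; the paper assumes sqrt(A) b is available).
    w, V = np.linalg.eigh(A)
    return (V * np.sqrt(w)) @ V.T

def krylov_basis(M, b, k):
    # Orthonormal basis of span{b, M b, ..., M^{k-1} b}, built with a
    # single Gram-Schmidt pass; adequate for a small dense example.
    Q = np.zeros((len(b), k))
    Q[:, 0] = b / np.linalg.norm(b)
    for j in range(1, k):
        v = M @ Q[:, j - 1]
        v -= Q[:, :j] @ (Q[:, :j].T @ v)
        Q[:, j] = v / np.linalg.norm(v)
    return Q

def projected_solve(A, b, Q):
    # Galerkin projection: x = Q y where (Q^T A Q) y = Q^T b.
    y = np.linalg.solve(Q.T @ A @ Q, Q.T @ b)
    return Q @ y

rng = np.random.default_rng(0)
n, k = 50, 8
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)   # well-conditioned SPD test matrix
b = rng.standard_normal(n)

S = sqrtm_spd(A)              # plays the role of A^{1/2}
x_root = projected_solve(A, b, krylov_basis(S, b, k))  # space built from sqrt(A)
x_std = projected_solve(A, b, krylov_basis(A, b, k))   # ordinary Krylov space

print(np.linalg.norm(A @ x_root - b), np.linalg.norm(A @ x_std - b))
```

Both projections reduce the residual sharply for this well-conditioned example; the abstract's convergence result concerns how the √A-based space improves the iteration-count bound from O(√κ) toward O(κ^{1/4}) dependence on the condition number κ.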

| Original language | English (US) |
|---|---|
| Pages (from-to) | 441-456 |
| Number of pages | 16 |
| Journal | Journal of Scientific Computing |
| Volume | 15 |
| Issue number | 4 |
| DOIs | 10.1023/A:1011132730828 |
| State | Published - Dec 1 2001 |
| Externally published | Yes |

### Keywords

- Conjugate gradient method
- Convergence rate
- Krylov space
- Modified conjugate gradient method
- Stability

### ASJC Scopus subject areas

- Software
- Theoretical Computer Science
- Numerical Analysis
- Engineering(all)
- Computational Theory and Mathematics
- Computational Mathematics
- Applied Mathematics

## Cite this

Solving Ax = b using a modified conjugate gradient method based on roots of A. *Journal of Scientific Computing*, *15*(4), 441-456. https://doi.org/10.1023/A:1011132730828