In this paper we present computer simulation results on the band-limited signal extrapolation problem. First, the performance of several existing algorithms is compared for the noise-free case. We then describe modifications of these algorithms for computing the extrapolation when the given signal is contaminated with noise. Computer simulation results for both the noiseless and noisy cases are included. From these results, the following preliminary conclusion can be drawn: two-step algorithms appear to give better reconstructions and require less computing time than the iterative algorithms considered in this paper.
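The abstract does not name the iterative algorithms compared, but the classic iterative scheme for this problem is the Papoulis-Gerchberg iteration, which alternately enforces the band-limit in the frequency domain and the known samples in the signal domain. The following is a minimal sketch of that iteration for the noise-free case, using illustrative parameter names (`band`, `n_iter`) not taken from the paper:

```python
import numpy as np

def pg_extrapolate(known, known_idx, n_total, band, n_iter=200):
    """Papoulis-Gerchberg iteration (illustrative sketch):
    alternately low-pass filter the current estimate and
    reimpose the known samples."""
    x = np.zeros(n_total)
    x[known_idx] = known
    # Low-pass mask keeping normalized frequencies |f| <= band
    freqs = np.fft.fftfreq(n_total)
    mask = np.abs(freqs) <= band
    for _ in range(n_iter):
        X = np.fft.fft(x)
        x = np.real(np.fft.ifft(X * mask))  # enforce the band-limit
        x[known_idx] = known                # reimpose the known segment
    return x

# Example: extrapolate a band-limited cosine from a central segment.
N = 128
t = np.arange(N)
x_true = np.cos(2 * np.pi * 3 * t / N)      # 3 cycles -> well inside the band
known_idx = np.arange(32, 96)               # observed central segment
x_est = pg_extrapolate(x_true[known_idx], known_idx, N, band=0.05)
```

The slow convergence of this kind of iteration is one reason the abstract's preliminary conclusion, that two-step (direct) methods can be faster, is plausible.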