We investigate a model of polynomial-time concept prediction which is a relaxation of the distribution-independent model of concept learning due to Valiant. Prediction-preserving reductions are defined and are shown to be effective tools for comparing the relative difficulty of solving various prediction problems. A number of prediction-preserving reductions are given. For example, if deterministic finite automata are polynomially predictable, then so are all Boolean formulas. We develop a complexity theory for prediction problems that parallels standard complexity theory. It is shown that certain problems of concept prediction are "prediction-complete" for a complexity class: a polynomial-time algorithm for the prediction problem would imply that all languages in the complexity class are polynomially predictable. For example, polynomial-time prediction of deterministic finite automata implies the polynomial predictability of all languages in the class LOG (deterministic logspace). Similar natural prediction-complete problems are given for the standard complexity classes NC1, NLOG, LOGCFL, and P. Showing that a prediction problem is prediction-complete for any of these classes provides varying degrees of evidence that no efficient prediction algorithm exists for the problem. Based on very weak cryptographic assumptions, we establish hardness results for prediction of Boolean circuits and other prediction problems that are prediction-complete for P. The recent related results of Kearns and Valiant, which show that Boolean formulas and DFAs are not polynomially predictable based on the assumed intractability of computing specific cryptographic functions, are also discussed.
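The central tool named above, a prediction-preserving reduction, can be illustrated schematically. The sketch below is not from the paper; the function names and the toy predictor are hypothetical. The idea it shows is the generic one: a reduction supplies an instance transformation `g` mapping instances of class A into the instance space of class B so that labels are preserved, and then any prediction algorithm for B can be wrapped to predict class A.

```python
# Illustrative sketch (hypothetical names, not the paper's notation).
# A prediction-preserving reduction from concept class A to class B
# provides a transformation g such that an instance x is in a concept
# of A iff g(x) is in the corresponding image concept of B. Given a
# predictor for B, we obtain a predictor for A by mapping all examples
# and queries through g.

def reduce_predictor(predict_b, g):
    """Wrap a prediction algorithm for class B into one for class A."""
    def predict_a(labeled_examples, query):
        # Map each labeled training example into B's instance space;
        # labels are preserved by the reduction, so they carry over.
        transformed = [(g(x), label) for x, label in labeled_examples]
        return predict_b(transformed, g(query))
    return predict_a

# Toy stand-in for a real prediction algorithm for class B: predict
# the label of the nearest training instance on the integer line.
def nearest_neighbor_predictor(examples, query):
    if not examples:
        return 0
    nearest = min(examples, key=lambda e: abs(e[0] - query))
    return nearest[1]

g = lambda x: x  # identity transformation, for illustration only
predict_a = reduce_predictor(nearest_neighbor_predictor, g)
print(predict_a([(1, 0), (5, 1), (9, 0)], 6))  # nearest example is (5, 1) -> 1
```

The point of the wrapper is the one the abstract exploits in both directions: if B is polynomially predictable and `g` is computable in polynomial time, then A is too; contrapositively, hardness of predicting A transfers to B, which is how prediction-completeness yields the hardness evidence described above.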
ASJC Scopus subject areas
- Theoretical Computer Science
- Applied Mathematics
- Computer Science (all)
- Computer Networks and Communications
- Computational Theory and Mathematics