### Abstract

Neural networks and rational functions efficiently approximate each other. In more detail, it is shown here that for any ReLU network, there exists a rational function of degree O(poly log(1/ε)) which is ε-close, and similarly for any rational function there exists a ReLU network of size O(poly log(1/ε)) which is ε-close. By contrast, polynomials need degree Ω(poly(1/ε)) to approximate even a single ReLU. When converting a ReLU network to a rational function as above, the hidden constants depend exponentially on the number of layers, which is shown to be tight; in other words, a compositional representation can be beneficial even for rational functions.
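The contrast between rational and polynomial approximation of a single ReLU can be illustrated numerically. The sketch below (not taken from the paper; the degree and grid size are illustrative choices) uses Newman's classical rational approximation of |x|, together with ReLU(x) = (x + |x|)/2, and compares it against a degree-matched Chebyshev least-squares polynomial fit:

```python
# Sketch: low-degree rational vs. polynomial approximation of ReLU on [-1, 1].
# Uses Newman's rational approximation of |x| (degree n), with
# p(t) = prod_{k=0}^{n-1} (t + xi^k), xi = exp(-1/sqrt(n)),
# |x| ~= x * (p(x) - p(-x)) / (p(x) + p(-x)),  error <= 3 exp(-sqrt(n)).
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def newman_relu(x, n=50):
    """Rational approximation of ReLU via ReLU(x) = (x + |x|)/2."""
    xi = np.exp(-1.0 / np.sqrt(n))
    roots = xi ** np.arange(n)                      # xi^0, xi^1, ..., xi^(n-1)
    p = lambda t: np.prod(t[:, None] + roots[None, :], axis=1)
    abs_approx = x * (p(x) - p(-x)) / (p(x) + p(-x))
    return (x + abs_approx) / 2.0

x = np.linspace(-1.0, 1.0, 2001)
target = relu(x)

# Degree-n Chebyshev least-squares fit: a near-best polynomial baseline.
n = 50
coeffs = np.polynomial.chebyshev.chebfit(x, target, deg=n)
poly_err = np.max(np.abs(np.polynomial.chebyshev.chebval(x, coeffs) - target))
rat_err = np.max(np.abs(newman_relu(x, n) - target))

print(f"degree {n}: polynomial sup error {poly_err:.2e}, rational {rat_err:.2e}")
```

At equal degree, the rational approximant's uniform error is markedly smaller than the polynomial's, consistent with the Ω(poly(1/ε)) polynomial lower bound versus the O(poly log(1/ε)) rational upper bound stated in the abstract.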

Original language | English (US)
---|---
Title of host publication | 34th International Conference on Machine Learning, ICML 2017
Publisher | International Machine Learning Society (IMLS)
Pages | 5195-5210
Number of pages | 16
ISBN (Electronic) | 9781510855144
State | Published - 2017
Event | 34th International Conference on Machine Learning, ICML 2017 - Sydney, Australia. Duration: Aug 6 2017 → Aug 11 2017

### Publication series

Name | 34th International Conference on Machine Learning, ICML 2017
---|---
Volume | 7

### Other

Other | 34th International Conference on Machine Learning, ICML 2017
---|---
Country | Australia
City | Sydney
Period | 8/6/17 → 8/11/17

### ASJC Scopus subject areas

- Computational Theory and Mathematics
- Human-Computer Interaction
- Software


## Cite this

Neural networks and rational functions. In *34th International Conference on Machine Learning, ICML 2017* (pp. 5195-5210). (34th International Conference on Machine Learning, ICML 2017; Vol. 7). International Machine Learning Society (IMLS).