We establish the first nontrivial lower bounds on time-space tradeoffs for the selection problem. We prove that any comparison-based randomized algorithm for finding the median requires Ω(n log log_S n) expected time in the RAM model (or, more generally, in the comparison branching program model), if we have S bits of extra space besides the read-only input array. This bound is tight for all S ≫ log n, and remains true even if the array is given in a random order. Our result thus answers a 16-year-old question of Munro and Raman, and also complements recent lower bounds that are restricted to sequential access, as in the multi-pass streaming model [Chakrabarti et al., SODA 2008]. We also prove that any comparison-based, deterministic, multi-pass streaming algorithm for finding the median requires Ω(n log*(n/s) + n log_s n) worst-case time (in scanning plus comparisons), if we have s cells of space. This bound is also tight for all s ≫ log² n. We get deterministic lower bounds for I/O-efficient algorithms as well. All proofs in this paper involve "elementary" techniques only.