We consider a network of sensors deployed to sense a spatial field and estimate a parameter of interest. Each sensor makes a sequence of measurements corrupted by noise. The estimation problem is to determine the parameter value that minimizes a cost function of the measurements and the unknown parameter. The cost decomposes as a sum of per-sensor terms, each depending only on the corresponding sensor's measurements; such cost functions arise naturally in regression. We are interested in solving the resulting optimization problem in a distributed and recursive manner. To this end, we combine the incremental gradient approach with the Robbins-Monro stochastic approximation algorithm to develop the Incremental Robbins-Monro Gradient (IRMG) algorithm. We investigate the convergence of the algorithm under a convexity assumption on the cost function and a stochastic model for the sensor measurements. In particular, we show that if the observations at each sensor are independent and identically distributed, then the IRMG algorithm converges to the optimal solution almost surely as the number of observations goes to infinity. We emphasize that the IRMG algorithm itself requires no information about the stochastic model.
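To make the idea concrete, the following is a minimal sketch of an incremental Robbins-Monro gradient update on a toy problem of our own choosing (it is not the paper's experimental setup): a scalar parameter is estimated from noisy per-sensor measurements, the estimate is cycled through the sensors, and each sensor applies a stochastic gradient step on its local quadratic cost with a diminishing step size satisfying the usual Robbins-Monro conditions. The function name `irmg`, the quadratic cost, and the step-size constant are illustrative assumptions.

```python
import random

def irmg(sensor_streams, theta0, num_cycles, step=0.5):
    """Sketch of an incremental Robbins-Monro gradient iteration.

    Cycle the shared estimate through the sensors; each sensor nudges it
    using the stochastic gradient of its local cost (theta - y)^2 at its
    newest noisy measurement y. Step sizes a_t = step / t satisfy the
    Robbins-Monro conditions: sum a_t = infinity, sum a_t^2 < infinity.
    """
    theta = theta0
    t = 0
    for _ in range(num_cycles):
        for draw in sensor_streams:
            t += 1
            y = draw()                     # one noisy measurement from this sensor
            grad = 2.0 * (theta - y)       # gradient of the local cost (theta - y)^2
            theta -= (step / t) * grad     # diminishing-step incremental update
    return theta

random.seed(0)
true_theta = 3.0
# Five sensors, each observing the true parameter plus i.i.d. Gaussian noise;
# note the algorithm itself uses no knowledge of this noise model.
streams = [lambda: true_theta + random.gauss(0.0, 0.5) for _ in range(5)]
est = irmg(streams, theta0=0.0, num_cycles=2000)
print(est)  # close to 3.0 after 10,000 incremental updates
```

With this particular quadratic cost and step choice the iteration reduces to a running average of all measurements, which illustrates the almost-sure convergence claimed for i.i.d. observations.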