TY - GEN
T1 - The power of slightly more than one sample in randomized load balancing
AU - Ying, Lei
AU - Srikant, Rayadurgam
AU - Kang, Xiaohan
PY - 2015/8/21
AB - In many computing and networking applications, arriving tasks must be routed to one of many servers with the goal of minimizing queueing delays. When the number of processors is very large, a popular routing algorithm works as follows: select two servers at random and route an arriving task to the less loaded of the two. It is well known that this algorithm dramatically reduces queueing delays compared to routing each task to a single randomly selected server. In recent cloud computing applications, however, it has been observed that even sampling two queues per arriving task can be expensive and can in fact increase delays due to messaging overhead, so there is interest in reducing the number of queues sampled per arriving task. In this paper, we show that the number of sampled queues can be dramatically reduced by exploiting the fact that tasks arrive in batches (called jobs). In particular, we sample a subset of the queues whose size is slightly larger than the batch size; thus, on average, we sample only slightly more than one queue per task. Once a random subset of the queues has been sampled, we apply a new load-balancing method, called batch-filling, that attempts to equalize the load among the sampled servers. We show that our algorithm dramatically reduces the sample complexity compared to previously proposed algorithms.
UR - http://www.scopus.com/inward/record.url?scp=84954235294&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84954235294&partnerID=8YFLogxK
DO - 10.1109/INFOCOM.2015.7218487
M3 - Conference contribution
AN - SCOPUS:84954235294
T3 - Proceedings - IEEE INFOCOM
SP - 1131
EP - 1139
BT - 2015 IEEE Conference on Computer Communications, IEEE INFOCOM 2015
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 34th IEEE Annual Conference on Computer Communications, IEEE INFOCOM 2015
Y2 - 26 April 2015 through 1 May 2015
ER -