Two sides of a coin: Optimizing the schedule of MapReduce jobs to minimize their makespan and improve cluster performance

Abhishek Verma, Ludmila Cherkasova, Roy H. Campbell

Research output: Contribution to journal › Article › peer-review

Abstract

Large-scale Hadoop clusters, with their data-intensive MapReduce-style applications that routinely process petabytes of unstructured and semi-structured data, represent a new entity in the changing landscape of modern data centers. A key challenge is to increase the utilization of these MapReduce clusters. For a set of production jobs that are executed periodically on new data, we can perform an offline analysis to evaluate the performance benefits of different optimization techniques. In this work, we consider a subset of the production workload that consists of MapReduce jobs with no dependencies. We observe that the order in which these jobs are executed can have a significant impact on their overall completion time and on cluster resource utilization. Our goal is to automate the design of a job schedule that minimizes the completion time (makespan) of such a set of MapReduce jobs. We introduce a simple abstraction in which each MapReduce job is represented as a pair of map and reduce stage durations. This representation enables us to apply the classic Johnson algorithm, which was designed for building an optimal two-stage job schedule. Simulations performed over a set of realistic workloads demonstrate that makespan improvements of 10%-25% are achievable simply by processing the jobs in the right order. However, in some cases, the simplified abstraction assumed by Johnson's algorithm can lead to a suboptimal job schedule. We design a novel heuristic, called BalancedPools, that significantly improves on Johnson's schedule (by up to 15%-38%) exactly in the situations where it produces a suboptimal makespan. The results of our simulation study are validated through experiments on a 66-node Hadoop cluster.
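
For illustration, below is a minimal Python sketch of the two-stage abstraction and the classic Johnson rule that the abstract describes; the job names and stage durations are hypothetical, and this shows the textbook Johnson algorithm, not the paper's BalancedPools heuristic.

```python
def johnson_schedule(jobs):
    """Order two-stage jobs to minimize makespan via Johnson's rule.

    jobs: list of (name, map_duration, reduce_duration) tuples.
    Returns job names in the recommended execution order.
    """
    # Jobs whose map stage is the shorter stage run first,
    # in increasing order of map duration.
    head = sorted((j for j in jobs if j[1] <= j[2]), key=lambda j: j[1])
    # Jobs whose reduce stage is the shorter stage run last,
    # in decreasing order of reduce duration.
    tail = sorted((j for j in jobs if j[1] > j[2]), key=lambda j: -j[2])
    return [j[0] for j in head + tail]

def makespan(jobs, order):
    """Two-stage pipeline model: a job's reduce stage starts only after
    its own map stage and the previous job's reduce stage have finished."""
    durations = {name: (m, r) for name, m, r in jobs}
    map_done = reduce_done = 0
    for name in order:
        m, r = durations[name]
        map_done += m  # map stages run back to back
        reduce_done = max(reduce_done, map_done) + r
    return reduce_done

# Hypothetical jobs with (map, reduce) stage durations in minutes.
jobs = [("sort", 4, 5), ("grep", 6, 2), ("join", 3, 7), ("count", 5, 4)]
order = johnson_schedule(jobs)
print(order)                   # ['join', 'sort', 'count', 'grep']
print(makespan(jobs, order))   # 21
```

Under this idealized two-stage (two-machine flow shop) model, Johnson's rule yields an optimal order; as the abstract notes, real executions can deviate from the abstraction, which is precisely where the BalancedPools heuristic recovers the additional 15%-38%.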

Original language: English (US)
Journal: HP Laboratories Technical Report
Issue number: 127
State: Published - 2012

Keywords

  • Batch workloads
  • Hadoop
  • MapReduce
  • Minimized makespan
  • Optimized schedule

ASJC Scopus subject areas

  • Software
  • Hardware and Architecture
  • Computer Networks and Communications
