TY - GEN
T1 - Densest Subgraph
T2 - 33rd Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2022
AU - Chekuri, Chandra
AU - Quanrud, Kent
AU - Torres, Manuel R.
N1 - Publisher Copyright:
Copyright © 2022 by SIAM. Unauthorized reproduction of this article is prohibited.
PY - 2022
Y1 - 2022
N2 - The densest subgraph problem in a graph (DSG), in its simplest form, is the following: given an undirected graph G = (V, E), find a subset S ⊆ V of vertices that maximizes the ratio |E(S)|/|S|, where E(S) is the set of edges with both endpoints in S. DSG and several of its variants are well studied in theory and practice and have many applications in data mining and network analysis. In this paper we study fast algorithms and structural aspects of DSG via the lens of supermodularity. For this we consider the densest supermodular subset problem (DSS): given a non-negative supermodular function f : 2^V → ℝ+, maximize f(S)/|S|. For DSG we describe a simple flow-based algorithm that outputs a (1 − ε)-approximation in deterministic Õ(m/ε) time, where m is the number of edges. Our algorithm is the first to have a near-linear dependence on m and 1/ε, and it improves previous methods based on an LP relaxation. It generalizes to hypergraphs, and also yields a faster algorithm for directed DSG. Greedy peeling algorithms have been very popular for DSG and several variants due to their efficiency, empirical performance, and worst-case approximation guarantees. We describe a simple peeling algorithm for DSS and analyze its approximation guarantee in a fashion that unifies several existing results. Boob et al. [12] developed an iterative peeling algorithm for DSG which appears to work very well in practice, and made a conjecture about its convergence to optimality. We affirmatively answer their conjecture, and in fact prove that a natural generalization of their algorithm converges to a (1 − ε)-approximation for any supermodular function f; the key to our proof is to consider an LP formulation that is derived via the Lovász extension of a supermodular function. For DSG the bound we prove on the number of iterations is O(Δ ln |V| / (λ* ε²)), where Δ is the maximum degree and λ* is the optimum value. Our work suggests that iterative peeling can be an effective heuristic for several objectives considered in the literature. Finally, we show that the 2-approximation for densest-at-least-k subgraph [37] extends to the supermodular setting. We also give a unified analysis of the peeling algorithm for this problem, and via this analysis derive an approximation guarantee for a generalization of DSS to maximize f(S)/g(|S|) for a concave function g.
AB - The densest subgraph problem in a graph (DSG), in its simplest form, is the following: given an undirected graph G = (V, E), find a subset S ⊆ V of vertices that maximizes the ratio |E(S)|/|S|, where E(S) is the set of edges with both endpoints in S. DSG and several of its variants are well studied in theory and practice and have many applications in data mining and network analysis. In this paper we study fast algorithms and structural aspects of DSG via the lens of supermodularity. For this we consider the densest supermodular subset problem (DSS): given a non-negative supermodular function f : 2^V → ℝ+, maximize f(S)/|S|. For DSG we describe a simple flow-based algorithm that outputs a (1 − ε)-approximation in deterministic Õ(m/ε) time, where m is the number of edges. Our algorithm is the first to have a near-linear dependence on m and 1/ε, and it improves previous methods based on an LP relaxation. It generalizes to hypergraphs, and also yields a faster algorithm for directed DSG. Greedy peeling algorithms have been very popular for DSG and several variants due to their efficiency, empirical performance, and worst-case approximation guarantees. We describe a simple peeling algorithm for DSS and analyze its approximation guarantee in a fashion that unifies several existing results. Boob et al. [12] developed an iterative peeling algorithm for DSG which appears to work very well in practice, and made a conjecture about its convergence to optimality. We affirmatively answer their conjecture, and in fact prove that a natural generalization of their algorithm converges to a (1 − ε)-approximation for any supermodular function f; the key to our proof is to consider an LP formulation that is derived via the Lovász extension of a supermodular function. For DSG the bound we prove on the number of iterations is O(Δ ln |V| / (λ* ε²)), where Δ is the maximum degree and λ* is the optimum value. Our work suggests that iterative peeling can be an effective heuristic for several objectives considered in the literature. Finally, we show that the 2-approximation for densest-at-least-k subgraph [37] extends to the supermodular setting. We also give a unified analysis of the peeling algorithm for this problem, and via this analysis derive an approximation guarantee for a generalization of DSS to maximize f(S)/g(|S|) for a concave function g.
UR - http://www.scopus.com/inward/record.url?scp=85121411765&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85121411765&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85121411765
T3 - Proceedings of the Annual ACM-SIAM Symposium on Discrete Algorithms
SP - 1531
EP - 1555
BT - ACM-SIAM Symposium on Discrete Algorithms, SODA 2022
PB - Association for Computing Machinery
Y2 - 9 January 2022 through 12 January 2022
ER -