TY - GEN
T1 - Optimizing Lossy Compression with Adjacent Snapshots for N-body Simulation Data
AU - Li, Sihuan
AU - Di, Sheng
AU - Liang, Xin
AU - Chen, Zizhong
AU - Cappello, Franck
N1 - Funding Information:
This research was supported by the Exascale Computing Project (ECP), Project Number: 17-SC-20-SC, a collaborative effort of two DOE organizations (the Office of Science and the National Nuclear Security Administration) responsible for the planning and preparation of a capable exascale ecosystem, including software, applications, hardware, advanced system engineering, and early testbed platforms, to support the nation's exascale computing imperative. The material was supported by the U.S. Department of Energy, Office of Science, under contract DE-AC02-06CH11357, and by the National Science Foundation under Grant No. 1619253. This research was also supported by NSF Award No. 1513201. We acknowledge the computing resources provided on Bebop, which is operated by the Laboratory Computing Resource Center at Argonne National Laboratory.
Publisher Copyright:
© 2018 IEEE.
PY - 2019/1/22
Y1 - 2019/1/22
N2 - Today's N-body simulations are producing extremely large amounts of data. The Hardware/Hybrid Accelerated Cosmology Code (HACC), for example, may simulate trillions of particles, producing tens of petabytes of data to store in a parallel file system, according to HACC users. In this paper, we design and implement an efficient, in situ, error-bounded lossy compressor to significantly reduce the data size for N-body simulations. Not only can our compressor save significant storage space for N-body simulation researchers, but it can also improve I/O performance considerably with limited memory and computation overhead. Our contribution is threefold. (1) We propose an efficient data compression model by leveraging the consecutiveness of the cosmological data in both the space and time dimensions as well as the physical correlation across different fields. (2) We propose a lightweight, efficient alignment mechanism to align the disordered particles across adjacent snapshots in the simulation, which is a fundamental step in the whole compression procedure. We also optimize the compression quality by exploring best-fit data prediction strategies and by optimizing the frequency of space-based vs. time-based compression. (3) We evaluate our compressor using both a cosmological simulation package and molecular dynamics simulation data, two major categories in the N-body simulation domain. Experiments show that, under the same data distortion, our solution produces up to 43% higher compression ratios on the velocity field and up to 300% higher on the position field than other state-of-the-art compressors (including SZ, ZFP, NUMARCK, and decimation). With our compressor, the overall I/O time on HACC data is reduced by up to 20% compared with the second-best compressor.
AB - Today's N-body simulations are producing extremely large amounts of data. The Hardware/Hybrid Accelerated Cosmology Code (HACC), for example, may simulate trillions of particles, producing tens of petabytes of data to store in a parallel file system, according to HACC users. In this paper, we design and implement an efficient, in situ, error-bounded lossy compressor to significantly reduce the data size for N-body simulations. Not only can our compressor save significant storage space for N-body simulation researchers, but it can also improve I/O performance considerably with limited memory and computation overhead. Our contribution is threefold. (1) We propose an efficient data compression model by leveraging the consecutiveness of the cosmological data in both the space and time dimensions as well as the physical correlation across different fields. (2) We propose a lightweight, efficient alignment mechanism to align the disordered particles across adjacent snapshots in the simulation, which is a fundamental step in the whole compression procedure. We also optimize the compression quality by exploring best-fit data prediction strategies and by optimizing the frequency of space-based vs. time-based compression. (3) We evaluate our compressor using both a cosmological simulation package and molecular dynamics simulation data, two major categories in the N-body simulation domain. Experiments show that, under the same data distortion, our solution produces up to 43% higher compression ratios on the velocity field and up to 300% higher on the position field than other state-of-the-art compressors (including SZ, ZFP, NUMARCK, and decimation). With our compressor, the overall I/O time on HACC data is reduced by up to 20% compared with the second-best compressor.
KW - Error-bounded lossy compression
KW - I/O performance
KW - large science data
KW - N-body simulation
UR - http://www.scopus.com/inward/record.url?scp=85062624055&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85062624055&partnerID=8YFLogxK
U2 - 10.1109/BigData.2018.8622101
DO - 10.1109/BigData.2018.8622101
M3 - Conference contribution
AN - SCOPUS:85062624055
T3 - Proceedings - 2018 IEEE International Conference on Big Data, Big Data 2018
SP - 428
EP - 437
BT - Proceedings - 2018 IEEE International Conference on Big Data, Big Data 2018
A2 - Song, Yang
A2 - Liu, Bing
A2 - Lee, Kisung
A2 - Abe, Naoki
A2 - Pu, Calton
A2 - Qiao, Mu
A2 - Ahmed, Nesreen
A2 - Kossmann, Donald
A2 - Saltz, Jeffrey
A2 - Tang, Jiliang
A2 - He, Jingrui
A2 - Liu, Huan
A2 - Hu, Xiaohua
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2018 IEEE International Conference on Big Data, Big Data 2018
Y2 - 10 December 2018 through 13 December 2018
ER -