TY - JOUR
T1 - Detection, instance segmentation, and classification for astronomical surveys with deep learning (DeepDISC): Detectron2 implementation and demonstration with Hyper Suprime-Cam data
AU - Merz, Grant
AU - Liu, Yichen
AU - Burke, Colin J
AU - Aleo, Patrick D
AU - Liu, Xin
AU - Kind, Matias Carrasco
AU - Kindratenko, Volodymyr
AU - Liu, Yufeng
N1 - The HSC collaboration includes the astronomical communities of Japan and Taiwan, and Princeton University. The HSC instrumentation and software were developed by the National Astronomical Observatory of Japan (NAOJ), the Kavli Institute for the Physics and Mathematics of the Universe (Kavli IPMU), the University of Tokyo, the High Energy Accelerator Research Organization (KEK), the Academia Sinica Institute for Astronomy and Astrophysics in Taiwan (ASIAA), and Princeton University. Funding was contributed by the FIRST program from the Japanese Cabinet Office, the Ministry of Education, Culture, Sports, Science and Technology (MEXT), the Japan Society for the Promotion of Science (JSPS), the Japan Science and Technology Agency (JST), the Toray Science Foundation, NAOJ, Kavli IPMU, KEK, ASIAA, and Princeton University.
This research has made use of the NASA/IPAC Infrared Science Archive, which is funded by the National Aeronautics and Space Administration and operated by the California Institute of Technology.
This work utilizes resources supported by the National Science Foundation’s Major Research Instrumentation program, grant no. 1725729, as well as the University of Illinois at Urbana-Champaign.
We thank Dr S. Luo and Dr D. Mu at the National Center for Supercomputing Applications (NCSA) for their assistance with the GPU cluster used in this work. We thank Y. Shen for helpful discussion on the HST observations of the COSMOS field. We thank the anonymous referees for helpful comments. GM, YL, YL, and XL acknowledge support from the NCSA Faculty Fellowship, the NCSA Students Pushing Innovation (SPIN) programs, and the National Science Foundation (NSF) grant AST-230817.
The Pan-STARRS1 Surveys (PS1) have been made possible through contributions of the Institute for Astronomy, the University of Hawaii, the Pan-STARRS Project Office, the Max-Planck Society and its participating institutes, the Max Planck Institute for Astronomy, Heidelberg and the Max Planck Institute for Extraterrestrial Physics, Garching, The Johns Hopkins University, Durham University, the University of Edinburgh, Queen’s University Belfast, the Harvard-Smithsonian Center for Astrophysics, the Las Cumbres Observatory Global Telescope Network Incorporated, the National Central University of Taiwan, the Space Telescope Science Institute, the National Aeronautics and Space Administration under grant no. NNX08AR22G issued through the Planetary Science Division of the NASA Science Mission Directorate, the National Science Foundation under grant no. AST-1238877, the University of Maryland, Eotvos Lorand University (ELTE), and the Los Alamos National Laboratory.
PY - 2023/11/1
Y1 - 2023/11/1
AB - The next generation of wide-field deep astronomical surveys will deliver unprecedented amounts of images through the 2020s and beyond. As both the sensitivity and depth of observations increase, more blended sources will be detected. This reality can lead to measurement biases that contaminate key astronomical inferences. We implement new deep learning models available through Facebook AI Research's detectron2 repository to perform the simultaneous tasks of object identification, deblending, and classification on large multiband co-adds from the Hyper Suprime-Cam (HSC). We use existing detection/deblending codes and classification methods to train a suite of deep neural networks, including state-of-the-art transformers. Once trained, we find that transformers outperform traditional convolutional neural networks and are more robust to different contrast scalings. Transformers are able to detect and deblend objects closely matching the ground truth, achieving a median bounding box Intersection over Union of 0.99. Using high-quality class labels from the Hubble Space Telescope, we find that when classifying objects as either stars or galaxies, the best-performing networks can classify galaxies with near 100 per cent completeness and purity across the whole test sample and classify stars above 60 per cent completeness and 80 per cent purity out to HSC i-band magnitudes of 25 mag. This framework can be extended to other upcoming deep surveys such as the Legacy Survey of Space and Time and those with the Roman Space Telescope to enable fast source detection and measurement. Our code, deepdisc, is publicly available at https://github.com/grantmerz/deepdisc.
KW - techniques: image processing
KW - stars: general
KW - galaxies: general
KW - methods: data analysis
UR - http://www.scopus.com/inward/record.url?scp=85174380629&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85174380629&partnerID=8YFLogxK
U2 - 10.1093/mnras/stad2785
DO - 10.1093/mnras/stad2785
M3 - Article
SN - 0035-8711
VL - 526
SP - 1122
EP - 1137
JO - Monthly Notices of the Royal Astronomical Society
JF - Monthly Notices of the Royal Astronomical Society
IS - 1
ER -