Shannon's mutual information measures the degree of mutual dependence between two random variables. Two related information functionals have also been developed in the literature: multiinformation, a multivariate extension of mutual information, and lautum information, the Csiszár conjugate of mutual information. In this work, we define illum information, the multivariate extension of lautum information and the Csiszár conjugate of multiinformation. We provide operational interpretations of this functional, including its role in testing the independence of a set of random variables. We also establish informational characterizations of illum information, such as the data processing inequality and the chain rule for distributions on tree-structured graphical models. Finally, as illustrative examples, we compute the illum information for Ising models and Gauss-Markov random fields.
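As a minimal numerical sketch of the two functionals contrasted above: multiinformation is the relative entropy from a joint distribution to the product of its marginals, and illum information (like lautum information in the bivariate case) reverses the two arguments. The toy joint pmf below is our own illustrative example, not one from the paper.

```python
import math

def kl(p, q):
    """Relative entropy D(p || q) in nats; p, q are dicts over the same finite support."""
    return sum(px * math.log(px / q[x]) for x, px in p.items() if px > 0)

# Toy joint pmf over (X1, X2, X3) in {0,1}^3 (hypothetical example distribution).
joint = {
    (0, 0, 0): 0.2, (0, 0, 1): 0.1, (0, 1, 0): 0.1, (0, 1, 1): 0.1,
    (1, 0, 0): 0.1, (1, 0, 1): 0.1, (1, 1, 0): 0.1, (1, 1, 1): 0.2,
}

def marginal(joint, i):
    """Marginal pmf of the i-th coordinate of the joint pmf."""
    m = {}
    for x, px in joint.items():
        m[x[i]] = m.get(x[i], 0.0) + px
    return m

margs = [marginal(joint, i) for i in range(3)]
# Product-of-marginals pmf on the same support.
prod_pmf = {x: margs[0][x[0]] * margs[1][x[1]] * margs[2][x[2]] for x in joint}

multiinfo = kl(joint, prod_pmf)  # multiinformation: D(P_joint || P_product)
illum = kl(prod_pmf, joint)      # illum information: D(P_product || P_joint)
print(f"multiinformation = {multiinfo:.4f} nats, illum information = {illum:.4f} nats")
```

Both quantities are nonnegative and vanish exactly when the variables are independent, but they generally differ, since relative entropy is asymmetric in its arguments.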